Choose at Your Own Risk
How technology is changing our choices and the values
that help us make them.
By Brad Allenby
An ethical minefield.
Photo illustration by Slate. Photo by Thinkstock
You’re sitting at your desk, with the order in front of you that
will enable the use of
autonomous lethal robots in your upcoming battle. If you sign
it, you will probably save many
of your soldiers’ lives, and likely reduce civilian casualties. On
the other hand, things could go
wrong, badly wrong, and having used the technology once, your
country will have little
ground to argue that others, perhaps less responsible, should
unleash the same technology.
What do you choose? And what values inform your choice?
We like to think we have choices in all matters, and that we
exercise values in making our
choices. This is perhaps particularly true with regard to new
technologies, from incremental
ones that build on existing products, such as new apps, to more
fundamental technologies
such as lethal autonomous robots for military and security
operations. But in reality we
understand that our choices are always significantly limited, and
that our values shift over
time in unpredictable ways. This is especially true with
emerging technologies, where values
that may lead one society to reject a technology are seldom
universal, meaning that the
technology is simply developed and deployed elsewhere. In a
world where technology is a
major source of status and power, that usually means the society
rejecting technology has, in
fact, chosen to slide down the league tables. (Europe may be
one example.)
Take choice. To say that one has a choice implies, among other
things, that one has the
power to make a selection among options, and that one
understands the implications of that
selection. Obviously, reality and existing systems significantly
bound whatever options might
be available. In 1950, I could not have chosen a mobile phone:
They didn’t exist. Today, I
can’t choose a hydrogen car: While one could certainly be built,
the infrastructure to provide
me with hydrogen so I could use that vehicle as a car in the real
world doesn’t exist. Before
the Sony Walkman, for example, no one would have said they
needed a Walkman, because
they had no idea it was an option.
Moreover, in the real world no choice is made with full
information. This is true at the
individual level; if I could choose my stocks with full
information, I would be rich. It is also true
at the level of firms and governments: Investments and
policies are formulated on best
available information, because otherwise uncertainty becomes
paralysis, and that carries its
own cost.
Values, the principles we live by, are equally complicated. They
arise from many different
sources: one’s parents, cultural context, experiences, and
contemplation of what constitutes
meaning and the good life. Professions such as engineering,
law, the military, and medicine
all have formal and codified statements of ethics and values. In
general, values systems tend
to have two fundamental bases: absolute rules, such as the Ten
Commandments of
Christianity, or the goal of enhancing individual and social
welfare, such as utilitarianism. In
practice, most people mix and match their frameworks, adapting
the broad generalizations of
their ethical systems to the circumstances and issues of a
particular decision.
Both choices and values rely on incomplete, but adequate,
information. With inadequate
information, choice may exist, but it is meaningless, because
you don’t know enough about
what you are choosing to make a rational differentiation among
options, and because the
results of the choice are unknowable anyway. Without
information, values become exercises
in meaningless ritual. The Sixth Commandment is, after all,
thou shalt not kill, but people do
so all the time. Sometimes it is in a military operation, a form
of killing that, done within the
constraints of international law and norms, is generally
considered lawful and ethical;
sometimes it is the state responding to a criminal act with
capital punishment; sometimes it is
considered not justified and thus murder. It is the context—the
information surrounding the
situation in which the killing occurs—that determines whether
or not the absolute rule
applies. And utilitarianism obviously requires some information
about the predictable
outcomes of the decision, because otherwise how do you know
that on balance you’ve
improved the situation?
We are used to a rate of change that, even if it does make
choices and values somewhat
contingent, at least allows for reasonable stability. Today,
however, rapid and accelerating
technological change, especially in the foundational
technological systems known as the Five
Horsemen—nanotechnology, biotechnology, information and
communication technology,
robotics, and applied cognitive science—has overthrown that
stability. The cycle time of
technology innovations and the rapidity with which they ripple
through society have become
far faster than the institutions and mechanisms that we’ve
traditionally relied on to inform and
enforce our choices and values.
The results so far aren’t pretty. One popular approach, for
example, is the precautionary
principle. As stated in the U.N. World Charter for Nature, this
requires that (Section II, para.
11(b)) “proponents [of a new technology] shall demonstrate that
expected benefits outweigh
potential damage to nature, and where potential adverse effects
are not fully understood, the
activities should not proceed.” Since no one can predict the
future evolution and systemic
effects of even a relatively trivial technology, it is impossible in
any real-world situation to
understand either the adverse or positive effects of introducing
such a technology.
Accordingly, this amounts to the (unrealistic) demand that
technology simply stop. Some
activists, of course, avoid the intermediate step and simply
demand that technologies stop:
This is the approach many European greens have taken to
agricultural genetically modified
organisms and some Americans have taken toward stem cell
research. The environmental
group ETC has called for “a moratorium on the environmental
release or commercial use of
nanomaterials, a ban on self-assembling nanomaterials and on
patents on nanoscale
technologies.”
While on the surface these may appear to be supremely ethical
positions, they are arguably
simply evasions of the responsibility for ethical choice and
values. It is romantic and simple to
generate dystopian futures and then demand the world stop lest
they occur. It is far more
difficult to make the institutional and personal adjustments that
choice and values require in
the increasingly complex, rapidly changing world within which
we find ourselves. It is
romantic and simple to demand bans and similar extreme
positions given that, in a world with
many countries and cultural groups striving for dominance,
there is little if any real chance of
banning emerging technologies, especially if they provide a
significant economic,
technological, military, or security benefit.
What is needed, then, is some way to more effectively assert
meaningful choice and
responsible values regarding emerging technologies. There are
at least two ways that are
already used by institutions that face both “known unknowns”
and “unknown unknowns”:
scenarios, and real-time dialog with rapidly changing systems.
Scenarios, which are not
predictions of what might happen, are like war games: They
provide valuable experience in
adaptation, as well as opportunities to think about what might
happen. An interesting
additional form of scenario is to engage science fiction in this
regard. Good science fiction
often takes complex emerging technologies and explores their
implications in ways that
might be fictional, but can lead to much deeper understanding
of real-world systems.
Real-time dialog means that institutions that must deal with
unpredictable, complex adaptive
systems know that there is no substitute for monitoring and
responding to the changes that
are thrown up by complex systems as they occur: Responsible
and effective management
requires a process of constant interaction. Such a process is, of
course, only half the battle;
the other is to design institutions that are robust enough, but
agile enough, to use real-time
information to make necessary changes rapidly. What is
ethically responsible is not just
fixation on rules or outcomes. Rather, it is to focus on the
process and the institutions
involved by making sure that there is a transparent and
workable mechanism for observing
and understanding the technology system as it evolves, and that relevant institutions are able to respond rapidly and effectively to what is learned.
It is premature to say that we understand how to implement
meaningful choice and
responsible values when it comes to emerging technologies.
Indeed, much of what we do
today is naive and superficial, steeped in reflexive ideologies
and overly rigid worldviews. But
the good news is that we do know how to do better, and some of
the steps we should take. It
is, of course, a choice based on the values we hold as to whether
we do so.
On Friday, March 6, Arizona State University will host
“Emerge: The Future of Choices and
Values,” a festival of the future, in Tempe. For more
information, visit emerge.asu.edu. Future
Tense is a partnership of Slate, New America, and ASU.
Technology Changing How Students
Learn, Teachers Say
By Matt Richtel | Oct. 31, 2012
Lisa Baldwin, a chemistry teacher, works with her students to
fight through academic challenges. Nancy Palmieri for
The New York Times
There is a widespread belief among teachers that students’
constant use of digital
technology is hampering their attention spans and ability to
persevere in the face
of challenging tasks, according to two surveys of teachers being
released on
Thursday.
The researchers note that their findings represent the subjective
views of
teachers and should not be seen as definitive proof that
widespread use of
computers, phones and video games affects students’ capability
to focus.
Even so, the researchers who performed the studies, as well as
scholars who
study technology’s impact on behavior and the brain, say the
studies are
significant because of the vantage points of teachers, who spend
hours a day
observing students.
The timing of the studies, from two well-regarded research
organizations,
appears to be coincidental.
One was conducted by the Pew Internet Project, a division of
the Pew Research
Center that focuses on technology-related research. The other
comes from
Common Sense Media, a nonprofit organization in San
Francisco that advises
parents on media use by children. It was conducted by Vicky
Rideout, a
researcher who has previously shown that media use among
children and
teenagers ages 8 to 18 has grown so fast that they on average
spend twice as much
time with screens each year as they spend in school.
Teachers who were not involved in the surveys echoed their
findings in
interviews, saying they felt they had to work harder to capture
and hold students’
attention.
“I’m an entertainer. I have to do a song and dance to capture
their attention,”
said Hope Molina-Porter, 37, an English teacher at Troy High
School in Fullerton,
Calif., who has taught for 14 years. She teaches accelerated
students, but has
noted a marked decline in the depth and analysis of their written
work.
She said she did not want to shrink from the challenge of
engaging them, nor did
other teachers interviewed, but she also worried that technology
was causing a
deeper shift in how students learned. She also wondered if
teachers were adding
to the problem by adjusting their lessons to accommodate
shorter attention
spans.
“Are we contributing to this?” Ms. Molina-Porter said. “What’s
going to happen
when they don’t have constant entertainment?”
Scholars who study the role of media in society say no long-
term studies have
been done that adequately show how and if student attention
span has changed
because of the use of digital technology. But there is mounting
indirect evidence
that constant use of technology can affect behavior, particularly
in developing
brains, because of heavy stimulation and rapid shifts in
attention.
Kristen Purcell, the associate director for research at Pew,
acknowledged that the
findings could be viewed from another perspective: that the
education system
must adjust to better accommodate the way students learn, a
point that some
teachers brought up in focus groups themselves.
“What we’re labeling as ‘distraction,’ some see as a failure of
adults to see how
these kids process information,” Ms. Purcell said. “They’re not
saying distraction
is good but that the label of ‘distraction’ is a judgment of this
generation.”
The surveys also found that many teachers said technology
could be a useful
educational tool. In the Pew survey, which was done in
conjunction with the
College Board and the National Writing Project, roughly 75
percent of 2,462
teachers surveyed said that the Internet and search engines had
a “mostly
positive” impact on student research skills. And they said such
tools had made
students more self-sufficient researchers.
But nearly 90 percent said that digital technologies were
creating “an easily
distracted generation with short attention spans.”
Hope Molina-Porter, an English teacher in Fullerton, Calif.,
worries that technology is deeply altering how students
learn. Monica Almeida/The New York Times
Similarly, of the 685 teachers surveyed in the Common Sense
project, 71 percent
said they thought technology was hurting attention span
“somewhat” or “a lot.”
About 60 percent said it hindered students’ ability to write and
communicate
face to face, and almost half said it hurt critical thinking and
their ability to do
homework.
There was little difference in how younger and older teachers
perceived the
impact of technology.
“Boy, is this a clarion call for a healthy and balanced media
diet,” said Jim Steyer,
the chief executive of Common Sense Media. He added, “What
you have to
understand as a parent is that what happens in the home with
media
consumption can affect academic achievement.”
In interviews, teachers described what might be called a
“Wikipedia problem,” in
which students have grown so accustomed to getting quick
answers with a few
keystrokes that they are more likely to give up when an easy
answer eludes them.
The Pew research found that 76 percent of teachers believed
students had been
conditioned by the Internet to find quick answers.
“They need skills that are different than ‘Spit, spit, there’s the
answer,’ ” said Lisa
Baldwin, 48, a high school teacher in Great Barrington, Mass.,
who said students’
ability to focus and fight through academic challenges was
suffering an
“exponential decline.” She said she saw the decline most
sharply in students
whose parents allowed unfettered access to television, phones,
iPads and video
games.
For her part, Ms. Baldwin said she refused to lower her
expectations or shift her
teaching style to be more entertaining. But she does spend much
more time in
individual tutoring sessions, she added, coaching students on
how to work
through challenging assignments.
Other teachers said technology was as much a solution as a
problem. Dave
Mendell, 44, a fourth-grade teacher in Wallingford, Pa., said
that educational
video games and digital presentations were excellent ways to
engage students on
their terms. Teachers also said they were using more dynamic
and flexible
teaching styles.
“I’m tap dancing all over the place,” Mr. Mendell said. “The
more I stand in front
of class, the easier it is to lose them.”
He added that it was tougher to engage students, but that once
they were
engaged, they were just as able to solve problems and be
creative as they had
been in the past. He would prefer, he added, for students to use
less
entertainment media at home, but he did not believe it
represented an
insurmountable challenge for teaching them at school.
While the Pew research explored how technology has affected
attention span, it
also looked at how the Internet has changed student research
habits. By contrast,
the Common Sense survey focused largely on how teachers saw
the impact of
entertainment media on a range of classroom skills.
The surveys include some findings that appear contradictory. In
the Common
Sense report, for instance, some teachers said that even as they
saw attention
spans wane, students were improving in subjects like math,
science and reading.
But researchers said the conflicting views could be the result of
subjectivity and
bias. For example, teachers may perceive themselves as facing a more difficult challenge while also believing that they are overcoming it through effective teaching.
Pew said its research gave a “complex and at times
contradictory” picture of
teachers’ view of technology’s impact.
Dr. Dimitri Christakis, who studies the impact of technology on
the brain and is
the director of the Center for Child Health, Behavior and
Development at Seattle
Children’s Hospital, emphasized that teachers’ views were
subjective but
nevertheless could be accurate in sensing dwindling attention
spans among
students.
His own research shows what happens to attention and focus in
mice when they
undergo the equivalent of heavy digital stimulation. Students
saturated by
entertainment media, he said, were experiencing a
“supernatural” stimulation
that teachers might have to keep up with or simulate.
The heavy technology use, Dr. Christakis said, “makes reality
by comparison
uninteresting.”
A version of this article appears in print on November 1, 2012,
on Page A18 of the
New York edition with the headline: For Better and for Worse,
Technology Use
Alters Learning Styles, Teachers Say.
http://www.nytimes.com/2012/11/01/education/technology-is-changing-how-students-learn-teachers-say.html
A World Without Work
By Derek Thompson
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the
United States, but it is
something like a moment in history for Youngstown, Ohio, one
its residents can cite with
precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills
delivered such great prosperity that
the city was a model of the American dream, boasting a median
income and a
homeownership rate that were among the nation’s highest. But
as manufacturing shifted
abroad after World War II, Youngstown steel suffered, and on
that gray September afternoon
in 1977, Youngstown Sheet and Tube announced the shuttering
of its Campbell Works mill.
Within five years, the city lost 50,000 jobs and $1.3 billion in
manufacturing wages. The effect
was so severe that a term was coined to describe the fallout:
regional depression.
Youngstown was transformed not only by an economic
disruption but also by a psychological
and cultural breakdown. Depression, spousal abuse, and suicide
all became much more
prevalent; the caseload of the area’s mental-health center tripled
within a decade. The city
built four prisons in the mid-1990s—a rare growth industry.
One of the few downtown
construction projects of that period was a museum dedicated to
the defunct steel industry.
This winter, I traveled to Ohio to consider what would happen if
technology permanently
replaced a great deal of human work. I wasn’t seeking a tour of
our automated future. I went
because Youngstown has become a national metaphor for the
decline of labor, a place where
the middle class of the 20th century has become a museum
exhibit.
“Youngstown’s story is America’s story, because it shows that
when jobs go away, the
cultural cohesion of a place is destroyed,” says John Russo, a
professor of labor studies at
Youngstown State University. “The cultural breakdown matters
even more than the economic
breakdown.”
In the past few years, even as the United States has pulled itself
partway out of the jobs hole
created by the Great Recession, some economists and
technologists have warned that the
economy is near a tipping point. When they peer deeply into
labor-market data, they see
troubling signs, masked for now by a cyclical recovery. And
when they look up from their
spreadsheets, they see automation high and low—robots in the
operating room and behind
the fast-food counter. They imagine self-driving cars snaking
through the streets and Amazon
drones dotting the sky, replacing millions of drivers, warehouse
stockers, and retail workers.
They observe that the capabilities of machines—already
formidable—continue to expand
exponentially, while our own remain the same. And they
wonder: Is any job truly safe?
Futurists and science-fiction writers have at times looked
forward to machines’ workplace
takeover with a kind of giddy excitement, imagining the
banishment of drudgery and its
replacement by expansive leisure and almost limitless personal
freedom. And make no
mistake: if the capabilities of computers continue to multiply
while the price of computing
continues to decline, that will mean a great many of life’s
necessities and luxuries will become
ever cheaper, and it will mean great wealth—at least when
aggregated up to the level of the
national economy.
But even leaving aside questions of how to distribute that
wealth, the widespread
disappearance of work would usher in a social transformation
unlike any we’ve seen. If John
Russo is right, then saving work is more important than saving
any particular job.
Industriousness has served as America’s unofficial religion
since its founding. The sanctity
and preeminence of work lie at the heart of the country’s
politics, economics, and social
interactions. What might happen if work goes away?
The U.S. labor force has been shaped by millennia of
technological progress. Agricultural
technology birthed the farming industry, the industrial
revolution moved people into factories,
and then globalization and automation moved them back out,
giving rise to a nation of
services. But throughout these reshufflings, the total number of
jobs has always increased.
What may be looming is something different: an era of
technological unemployment, in which
computer scientists and software engineers essentially invent us
out of work, and the total
number of jobs declines steadily and permanently.
This fear is not new. The hope that machines might free us from
toil has always been
intertwined with the fear that they will rob us of our agency. In
the midst of the Great
Depression, the economist John Maynard Keynes forecast that
technological progress might
allow a 15-hour workweek, and abundant leisure, by 2030. But
around the same time,
President Herbert Hoover received a letter warning that
industrial technology was a
“Frankenstein monster” that threatened to upend manufacturing,
“devouring our civilization.”
(The letter came from the mayor of Palo Alto, of all places.) In
1962, President John F.
Kennedy said, “If men have the talent to invent new machines
that put men out of work, they
have the talent to put those men back to work.” But two years
later, a committee of scientists
and social activists sent an open letter to President Lyndon B.
Johnson arguing that “the
cybernation revolution” would create “a separate nation of the
poor, the unskilled, the
jobless,” who would be unable either to find work or to afford
life’s necessities.
The job market defied doomsayers in those earlier times, and
according to the most
frequently reported jobs numbers, it has so far done the same in
our own time.
Unemployment is currently just over 5 percent, and 2014 was
this century’s best year for job
growth. One could be forgiven for saying that recent predictions
about technological job
displacement are merely forming the latest chapter in a long
story called The Boys Who Cried
Robot—one in which the robot, unlike the wolf, never arrives in
the end.
The end-of-work argument has often been dismissed as the
“Luddite fallacy,” an allusion to
the 19th-century British brutes who smashed textile-making
machines at the dawn of the
industrial revolution, fearing the machines would put hand-
weavers out of work. But some of
the most sober economists are beginning to worry that the
Luddites weren’t wrong, just
premature. When former Treasury Secretary Lawrence Summers
was an MIT undergraduate
in the early 1970s, many economists disdained “the stupid
people [who] thought that
automation was going to make all the jobs go away,” he said at
the National Bureau of
Economic Research Summer Institute in July 2013. “Until a few
years ago, I didn’t think this
was a very complicated subject: the Luddites were wrong, and
the believers in technology
and technological progress were right. I’m not so completely
certain now.”
2. Reasons to Cry Robot
What does the “end of work” mean, exactly? It does not mean
the imminence of total
unemployment, nor is the United States remotely likely to face,
say, 30 or 50 percent
unemployment within the next decade. Rather, technology could
exert a slow but continual
downward pressure on the value and availability of work—that
is, on wages and on the share
of prime-age workers with full-time jobs. Eventually, by
degrees, that could create a new
normal, where the expectation that work will be a central
feature of adult life dissipates for a
significant portion of society.
After 300 years of people crying wolf, there are now three
broad reasons to take seriously the
argument that the beast is at the door: the ongoing triumph of
capital over labor, the quiet
demise of the working man, and the impressive dexterity of
information technology.
• Labor’s losses. One of the first things we might expect to see
in a period of technological
displacement is the diminishment of human labor as a driver of
economic growth. In fact,
signs that this is happening have been present for quite some
time. The share of U.S.
economic output that’s paid out in wages fell steadily in the
1980s, reversed some of its
losses in the ’90s, and then continued falling after 2000,
accelerating during the Great
Recession. It now stands at its lowest level since the
government started keeping track in the
mid‑20th century.
A number of theories have been advanced to explain this
phenomenon, including
globalization and its accompanying loss of bargaining power for
some workers. But Loukas
Karabarbounis and Brent Neiman, economists at the University
of Chicago, have estimated
that almost half of the decline is the result of businesses’
replacing workers with computers
and software. In 1964, the nation’s most valuable company,
AT&T, was worth $267 billion in
today’s dollars and employed 758,611 people. Today’s
telecommunications giant, Google, is
worth $370 billion but has only about 55,000 employees—less
than a tenth the size of AT&T’s
workforce in its heyday.
The paradox of work is that many people hate their jobs, but
they are considerably
more miserable doing nothing.
• The spread of nonworking men and underemployed youth. The
share of prime-age
Americans (25 to 54 years old) who are working has been
trending down since 2000. Among
men, the decline began even earlier: the share of prime-age men
who are neither working nor
looking for work has doubled since the late 1970s, and has
increased as much throughout
the recovery as it did during the Great Recession itself. All in
all, about one in six prime-age
men today are either unemployed or out of the workforce
altogether. This is what the
economist Tyler Cowen calls “the key statistic” for
understanding the spreading rot in the
American workforce. Conventional wisdom has long held that
under normal economic
conditions, men in this age group—at the peak of their abilities
and less likely than women to
be primary caregivers for children—should almost all be
working. Yet fewer and fewer are.
Economists cannot say for certain why men are turning away
from work, but one explanation
is that technological change has helped eliminate the jobs for
which many are best suited.
Since 2000, the number of manufacturing jobs has fallen by
almost 5 million, or about 30
percent.
Young people just coming onto the job market are also
struggling—and by many measures
have been for years. Six years into the recovery, the share of
recent college grads who are
“underemployed” (in jobs that historically haven’t required a
degree) is still higher than it was
in 2007—or, for that matter, 2000. And the supply of these
“non-college jobs” is shifting away
from high-paying occupations, such as electrician, toward low-
wage service jobs, such as
waiter. More people are pursuing higher education, but the real
wages of recent college
graduates have fallen by 7.7 percent since 2000. In the biggest
picture, the job market
appears to be requiring more and more preparation for a lower
and lower starting wage. The
distorting effect of the Great Recession should make us cautious
about overinterpreting these
trends, but most began before the recession, and they do not
seem to speak encouragingly
about the future of work.
• The shrewdness of software. One common objection to the
idea that technology will
permanently displace huge numbers of workers is that new
gadgets, like self-checkout
kiosks at drugstores, have failed to fully displace their human
counterparts, like cashiers. But
employers typically take years to embrace new machines at the
expense of workers. The
robotics revolution began in factories in the 1960s and ’70s, but
manufacturing employment
kept rising until 1980, and then collapsed during the subsequent
recessions. Likewise, “the
personal computer existed in the ’80s,” says Henry Siu, an
economist at the University of
British Columbia, “but you don’t see any effect on office and
administrative-support jobs until
the 1990s, and then suddenly, in the last recession, it’s huge. So
today you’ve got checkout
screens and the promise of driverless cars, flying drones, and
little warehouse robots. We
know that these tasks can be done by machines rather than
people. But we may not see the
effect until the next recession, or the recession after that.”
Some observers say our humanity is a moat that machines
cannot cross. They believe
people’s capacity for compassion, deep understanding, and
creativity are inimitable. But as
Erik Brynjolfsson and Andrew McAfee have argued in their
book The Second Machine Age,
computers are so dexterous that predicting their application 10
years from now is almost
impossible. Who could have guessed in 2005, two years before
the iPhone was released, that
smartphones would threaten hotel jobs within the decade, by
helping homeowners rent out
their apartments and houses to strangers on Airbnb? Or that the
company behind the most
popular search engine would design a self-driving car that could
soon threaten driving, the most common occupation among American men?
In 2013, Oxford University researchers forecast that machines
might be able to perform half
of all U.S. jobs in the next two decades. The projection was
audacious, but in at least a few
cases, it probably didn’t go far enough. For example, the
authors named psychologist as one
of the occupations least likely to be “computerisable.” But some
research suggests that
people are more honest in therapy sessions when they believe
they are confessing their
troubles to a computer, because a machine can’t pass moral
judgment. Google and WebMD
already may be answering questions once reserved for one’s
therapist. This doesn’t prove
that psychologists are going the way of the textile worker.
Rather, it shows how easily
computers can encroach on areas previously considered “for
humans only.”
After 300 years of breathtaking innovation, people aren’t
massively unemployed or indentured
by machines. But to suggest how this could change, some
economists have pointed to the
defunct career of the second-most-important species in U.S.
economic history: the horse.
For many centuries, people created technologies that made the
horse more productive and
more valuable—like plows for agriculture and swords for battle.
One might have assumed
that the continuing advance of complementary technologies
would make the animal ever
more essential to farming and fighting, historically perhaps the
two most consequential
human activities. Instead came inventions that made the horse
obsolete—the tractor, the car,
and the tank. After tractors rolled onto American farms in the
early 20th century, the
population of horses and mules began to decline steeply, falling
nearly 50 percent by the
1930s and 90 percent by the 1950s.
Humans can do much more than trot, carry, and pull. But the
skills required in most offices
hardly elicit our full range of intelligence. Most jobs are still
boring, repetitive, and easily
learned. The most-common occupations in the United States are
retail salesperson, cashier,
food and beverage server, and office clerk. Together, these four
jobs employ 15.4 million
people—nearly 10 percent of the labor force, or more workers
than there are in Texas and
Massachusetts combined. Each is highly susceptible to
automation, according to the Oxford
study.
Technology creates some jobs too, but the creative half of
creative destruction is easily
overstated. Nine out of 10 workers today are in occupations that
existed 100 years ago, and
just 5 percent of the jobs generated between 1993 and 2013
came from “high tech” sectors
like computing, software, and telecommunications. Our newest
industries tend to be the
most labor-efficient: they just don’t require many people. It is
for precisely this reason that the
economic historian Robert Skidelsky, comparing the
exponential growth in computing power
with the less-than-exponential growth in job complexity, has
said, “Sooner or later, we will run
out of jobs.”
Is that certain—or certainly imminent? No. The signs so far are
murky and suggestive. The
most fundamental and wrenching job restructurings and
contractions tend to happen during
recessions: we’ll know more after the next couple of downturns.
But the possibility seems
significant enough—and the consequences disruptive enough—
that we owe it to ourselves to
start thinking about what society could look like without
universal work, in an effort to begin
nudging it toward the better outcomes and away from the worse
ones.
To paraphrase the science-fiction novelist William Gibson,
there are, perhaps, fragments of
the post-work future distributed throughout the present. I see
three overlapping possibilities
as formal employment opportunities decline. Some people
displaced from the formal
workforce will devote their freedom to simple leisure; some will
seek to build productive
communities outside the workplace; and others will fight,
passionately and in many cases
fruitlessly, to reclaim their productivity by piecing together
jobs in an informal economy.
These are futures of consumption, communal creativity, and
contingency. In any combination,
it is almost certain that the country would have to embrace a
radical new role for government.
3. Consumption: The Paradox of Leisure
Work is really three things, says Peter Frase, the author of Four
Futures, a forthcoming book
about how automation will change America: the means by which
the economy produces
goods, the means by which people earn income, and an activity
that lends meaning or
purpose to many people’s lives. “We tend to conflate these
things,” he told me, “because
today we need to pay people to keep the lights on, so to speak.
But in a future of abundance,
you wouldn’t, and we ought to think about ways to make it
easier and better to not be
employed.”
Frase belongs to a small group of writers, academics, and
economists—they have been
called “post-workists”—who welcome, even root for, the end of
labor. American society has
“an irrational belief in work for work’s sake,” says Benjamin
Hunnicutt, another post-workist
and a historian at the University of Iowa, even though most jobs
aren’t so uplifting. A 2014
Gallup report of worker satisfaction found that as many as 70
percent of Americans don’t feel
engaged by their current job. Hunnicutt told me that if a
cashier’s work were a video game—
grab an item, find the bar code, scan it, slide the item onward,
and repeat—critics of video
games might call it mindless. But when it’s a job, politicians
praise its intrinsic dignity.
“Purpose, meaning, identity, fulfillment, creativity, autonomy—
all these things that positive
psychology has shown us to be necessary for well-being are
absent in the average job,” he
said.
The post-workists are certainly right about some important
things. Paid labor does not
always map to social good. Raising children and caring for the
sick is essential work, and
these jobs are compensated poorly or not at all. In a post-work
society, Hunnicutt said,
people might spend more time caring for their families and
neighbors; pride could come from
our relationships rather than from our careers.
The post-work proponents acknowledge that, even in the best
post-work scenarios, pride
and jealousy will persevere, because reputation will always be
scarce, even in an economy of
abundance. But with the right government provisions, they
believe, the end of wage labor will
allow for a golden age of well-being. Hunnicutt said he thinks
colleges could reemerge as
cultural centers rather than job-prep institutions. The word
school, he pointed out, comes
from skholē, the Greek word for “leisure.” “We used to teach
people to be free,” he said.
“Now we teach them to work.”
Hunnicutt’s vision rests on certain assumptions about taxation
and redistribution that might
not be congenial to many Americans today. But even leaving
that aside for the moment, this
vision is problematic: it doesn’t resemble the world as it is
currently experienced by most
jobless people. By and large, the jobless don’t spend their
downtime socializing with friends
or taking up new hobbies. Instead, they watch TV or sleep.
Time-use surveys show that
jobless prime-age people dedicate some of the time once spent
working to cleaning and
childcare. But men in particular devote most of their free time
to leisure, the lion’s share of
which is spent watching television, browsing the Internet, and
sleeping. Retired seniors watch
about 50 hours of television a week, according to Nielsen. That
means they spend a majority
of their lives either sleeping or sitting on the sofa looking at a
flatscreen. The unemployed
theoretically have the most time to socialize, and yet studies
have shown that they feel the
most social isolation; it is surprisingly hard to replace the
camaraderie of the water cooler.
Most people want to work, and are miserable when they cannot.
The ills of unemployment go
well beyond the loss of income; people who lose their job are
more likely to suffer from
mental and physical ailments. “There is a loss of status, a
general malaise and
demoralization, which appears somatically or psychologically or
both,” says Ralph Catalano,
a public-health professor at UC Berkeley. Research has shown
that it is harder to recover
from a long bout of joblessness than from losing a loved one or
suffering a life-altering injury.
The very things that help many people recover from other
emotional traumas—a routine, an
absorbing distraction, a daily purpose—are not readily available
to the unemployed.
The transition from labor force to leisure force would likely be
particularly hard on Americans,
the worker bees of the rich world: Between 1950 and 2012,
annual hours worked per worker
fell significantly throughout Europe—by about 40 percent in
Germany and the Netherlands—
but by only 10 percent in the United States. Richer, college-
educated Americans are working
more than they did 30 years ago, particularly when you count
time working and answering e-
mail at home.
In 1989, the psychologists Mihaly Csikszentmihalyi and Judith
LeFevre conducted a famous
study of Chicago workers that found people at work often
wished they were somewhere else.
But in questionnaires, these same workers reported feeling
better and less anxious in the
office or at the plant than they did elsewhere. The two
psychologists called this “the paradox
of work”: many people are happier complaining about jobs than
they are luxuriating in too
much leisure. Other researchers have used the term guilty couch
potato to describe people
who use media to relax but often feel worthless when they
reflect on their unproductive
downtime. Contentment speaks in the present tense, but
something more—pride—comes
only in reflection on past accomplishments.
The post-workists argue that Americans work so hard because
their culture has conditioned
them to feel guilty when they are not being productive, and that
this guilt will fade as work
ceases to be the norm. This might prove true, but it’s an
untestable hypothesis. When I asked
Hunnicutt what sort of modern community most resembles his
ideal of a post-work society,
he admitted, “I’m not sure that such a place exists.”
Less passive and more nourishing forms of mass leisure could
develop. Arguably, they
already are developing. The Internet, social media, and gaming
offer entertainments that are
as easy to slip into as is watching TV, but all are more
purposeful and often less isolating.
Video games, despite the derision aimed at them, are vehicles
for achievement of a sort.
Jeremy Bailenson, a communications professor at Stanford, says
that as virtual-reality
technology improves, people’s “cyber-existence” will become
as rich and social as their
“real” life. Games in which users climb “into another person’s
skin to embody his or her
experiences firsthand” don’t just let people live out vicarious
fantasies, he has argued, but
also “help you live as somebody else to teach you empathy and
pro-social skills.”
But it’s hard to imagine that leisure could ever entirely fill the
vacuum of accomplishment left
by the demise of labor. Most people do need to achieve things
through, yes, work to feel a
lasting sense of purpose. To envision a future that offers more
than minute-to-minute
satisfaction, we have to imagine how millions of people might
find meaningful work without
formal wages. So, inspired by the predictions of one of
America’s most famous labor
economists, I took a detour on my way to Youngstown and
stopped in Columbus, Ohio.
4. Communal Creativity: The Artisans’ Revenge
Artisans made up the original American middle class. Before
industrialization swept through
the U.S. economy, many people who didn’t work on farms were
silversmiths, blacksmiths, or
woodworkers. These artisans were ground up by the machinery
of mass production in the
20th century. But Lawrence Katz, a labor economist at Harvard,
sees the next wave of
automation returning us to an age of craftsmanship and artistry.
In particular, he looks
forward to the ramifications of 3‑D printing, whereby machines
construct complex objects
from digital designs.
The factories that arose more than a century ago “could make
Model Ts and forks and knives
and mugs and glasses in a standardized, cheap way, and that
drove the artisans out of
business,” Katz told me. “But what if the new tech, like 3-D-
printing machines, can do
customized things that are almost as cheap? It’s possible that
information technology and
robots eliminate traditional jobs and make possible a new
artisanal economy … an economy
geared around self-expression, where people would do artistic
things with their time.”
In other words, it would be a future not of consumption but of
creativity, as technology
returns the tools of the assembly line to individuals,
democratizing the means of mass
production.
Something like this future is already present in the small but
growing number of industrial
shops called “makerspaces” that have popped up in the United
States and around the world.
The Columbus Idea Foundry is the country’s largest such space,
a cavernous converted shoe
factory stocked with industrial-age machinery. Several hundred
members pay a monthly fee
to use its arsenal of machines to make gifts and jewelry; weld,
finish, and paint; play with
plasma cutters and work an angle grinder; or operate a lathe
with a machinist.
When I arrived there on a bitterly cold afternoon in February, a
chalkboard standing on an
easel by the door displayed three arrows, pointing toward
bathrooms, pewter casting, and
zombies. Near the entrance, three men with black fingertips and
grease-stained shirts took
turns fixing a 60-year-old metal-turning lathe. Behind them, a
resident artist was tutoring an
older woman on how to transfer her photographs onto a large
canvas, while a couple of guys
fed pizza pies into a propane-fired stone oven. Elsewhere, men
in protective goggles welded
a sign for a local chicken restaurant, while others punched
codes into a computer-controlled
laser-cutting machine. Beneath the din of drilling and wood-
cutting, a Pandora rock station
hummed tinnily from a Wi‑Fi-connected Edison phonograph
horn. The foundry is not just a
gymnasium of tools. It is a social center.
Alex Bandar, who started the foundry after receiving a
doctorate in materials science and
engineering, has a theory about the rhythms of invention in
American history. Over the past
century, he told me, the economy has moved from hardware to
software, from atoms to bits,
and people have spent more time at work in front of screens.
But as computers take over
more tasks previously considered the province of humans, the
pendulum will swing back
from bits to atoms, at least when it comes to how people spend
their days. Bandar thinks
that a digitally preoccupied society will come to appreciate the
pure and distinct pleasure of
making things you can touch. “I’ve always wanted to usher in a
new era of technology where
robots do our bidding,” Bandar said. “If you have better
batteries, better robotics, more
dexterous manipulation, then it’s not a far stretch to say robots
do most of the work. So what
do we do? Play? Draw? Actually talk to each other again?”
You don’t need any particular fondness for plasma cutters to see
the beauty of an economy
where tens of millions of people make things they enjoy
making—whether physical or digital,
in buildings or in online communities—and receive feedback
and appreciation for their work.
The Internet and the cheap availability of artistic tools have
already empowered millions of
people to produce culture from their living rooms. People
upload more than 400,000 hours of
YouTube videos and 350 million new Facebook photos every
day. The demise of the formal
economy could free many would-be artists, writers, and
craftspeople to dedicate their time to
creative interests—to live as cultural producers. Such activities
offer virtues that many
organizational psychologists consider central to satisfaction at
work: independence, the
chance to develop mastery, and a sense of purpose.
After touring the foundry, I sat at a long table with several
members, sharing the pizza that
had come out of the communal oven. I asked them what they
thought of their organization as
a model for a future where automation reached further into the
formal economy. A mixed-
media artist named Kate Morgan said that most people she knew
at the foundry would quit
their jobs and use the foundry to start their own business if they
could. Others spoke about
the fundamental need to witness the outcome of one’s work,
which was satisfied more
deeply by craftsmanship than by other jobs they’d held.
Late in the conversation, we were joined by Terry Griner, an
engineer who had built miniature
steam engines in his garage before Bandar invited him to join
the foundry. His fingers were
covered in soot, and he told me about the pride he had in his
ability to fix things. “I’ve been
working since I was 16. I’ve done food service, restaurant work,
hospital work, and computer
programming. I’ve done a lot of different jobs,” said Griner,
who is now a divorced father. “But
if we had a society that said, ‘We’ll cover your essentials, you
can work in the shop,’ I think
that would be utopia. That, to me, would be the best of all
possible worlds.”
5. Contingency: “You’re on Your Own”
One mile to the east of downtown Youngstown, in a brick
building surrounded by several
empty lots, is Royal Oaks, an iconic blue-collar dive. At about
5:30 p.m. on a Wednesday, the
place was nearly full. The bar glowed yellow and green from the
lights mounted along a wall.
Old beer signs, trophies, masks, and mannequins cluttered the
back corner of the main room,
like party leftovers stuffed in an attic. The scene was mostly
middle-aged men, some in
groups, talking loudly about baseball and smelling vaguely of
pot; some drank alone at the
bar, sitting quietly or listening to music on headphones. I spoke
with several patrons there
who work as musicians, artists, or handymen; many did not hold
a steady job.
“It is the end of a particular kind of wage work,” said Hannah
Woodroofe, a bartender there
who, it turns out, is also a graduate student at the University of
Chicago. (She’s writing a
dissertation on Youngstown as a harbinger of the future of
work.) A lot of people in the city
make ends meet via “post-wage arrangements,” she said,
working for tenancy or under the
table, or trading services. Places like Royal Oaks are the new
union halls: People go there not
only to relax but also to find tradespeople for particular jobs,
like auto repair. Others go to
exchange fresh vegetables, grown in urban gardens they’ve
created amid Youngstown’s
vacant lots.
When an entire area, like Youngstown, suffers from high and
prolonged unemployment,
problems caused by unemployment move beyond the personal
sphere; widespread
joblessness shatters neighborhoods and leaches away their civic
spirit. John Russo, the
Youngstown State professor, who is a co-author of a history of
the city, Steeltown USA, says
the local identity took a savage blow when residents lost the
ability to find reliable
employment. “I can’t stress this enough: this isn’t just about
economics; it’s psychological,”
he told me.
Russo sees Youngstown as the leading edge of a larger trend
toward the development of
what he calls the “precariat”—a working class that swings from
task to task in order to make
ends meet and suffers a loss of labor rights, bargaining rights,
and job security. In
Youngstown, many of these workers have by now made their
peace with insecurity and
poverty by building an identity, and some measure of pride,
around contingency. The faith
they lost in institutions—the corporations that have abandoned
the city, the police who have
failed to keep them safe—has not returned. But Russo and
Woodroofe both told me they put
stock in their own independence. And so a place that once
defined itself single-mindedly by
the steel its residents made has gradually learned to embrace the
valorization of well-rounded
resourcefulness.
Karen Schubert, a 54-year-old writer with two master’s degrees,
accepted a part-time job as
a hostess at a café in Youngstown early this year, after spending
months searching for full-
time work. Schubert, who has two grown children and an infant
grandson, said she’d loved
teaching writing and literature at the local university. But many
colleges have replaced full-
time professors with part-time adjuncts in order to control costs,
and she’d found that with
the hours she could get, adjunct teaching didn’t pay a living
wage, so she’d stopped. “I think
I would feel like a personal failure if I didn’t know that so
many Americans have their leg
caught in the same trap,” she said.
Perhaps the 20th century will strike future historians as an
aberration, with its religious
devotion to overwork in a time of prosperity.
Among Youngstown’s precariat, one can see a third possible
future, where millions of people
struggle for years to build a sense of purpose in the absence of
formal jobs, and where
entrepreneurship emerges out of necessity. But while it lacks
the comforts of the
consumption economy or the cultural richness of Lawrence
Katz’s artisanal future, it is more
complex than an outright dystopia. “There are young people
working part-time in the new
economy who feel independent, whose work and personal
relationships are contingent, and
say they like it like this—to have short hours so they have time
to focus on their passions,”
Russo said.
Schubert’s wages at the café are not enough to live on, and in
her spare time, she sells books
of her poetry at readings and organizes gatherings of the
literary-arts community in
Youngstown, where other writers (many of them also
underemployed) share their prose. The
evaporation of work has deepened the local arts and music
scene, several residents told me,
because people who are inclined toward the arts have so much
time to spend with one
another. “We’re a devastatingly poor and hemorrhaging
population, but the people who live
here are fearless and creative and phenomenal,” Schubert said.
Whether or not one has artistic ambitions as Schubert does, it is
arguably growing easier to
find short-term gigs or spot employment. Paradoxically,
technology is the reason. A
constellation of Internet-enabled companies matches available
workers with quick jobs, most
prominently including Uber (for drivers), Seamless (for meal
deliverers), Homejoy (for house
cleaners), and TaskRabbit (for just about anyone else). And
online markets like Craigslist and
eBay have likewise made it easier for people to take on small
independent projects, such as
furniture refurbishing. Although the on-demand economy is not
yet a major part of the
employment picture, the number of “temporary-help services”
workers has grown by 50
percent since 2010, according to the Bureau of Labor Statistics.
Some of these services, too, could be usurped, eventually, by
machines. But on-demand
apps also spread the work around by carving up jobs, like
driving a taxi, into hundreds of little
tasks, like a single drive, which allows more people to compete
for smaller pieces of work.
These new arrangements are already challenging the legal
definitions of employer and
employee, and there are many reasons to be ambivalent about
them. But if the future involves
a declining number of full-time jobs, as in Youngstown, then
splitting some of the remaining
work up among many part-time workers, instead of a few full-
timers, wouldn’t necessarily be
a bad development. We shouldn’t be too quick to excoriate
companies that let people
combine their work, art, and leisure in whatever ways they
choose.
Today the norm is to think about employment and
unemployment as a black-and-white
binary, rather than two points at opposite ends of a wide
spectrum of working arrangements.
As late as the mid-19th century, though, the modern concept of
“unemployment” didn’t exist
in the United States. Most people lived on farms, and while paid
work came and went, home
industry—canning, sewing, carpentry—was a constant. Even in
the worst economic panics,
people typically found productive things to do. The
despondency and helplessness of
unemployment were discovered, to the bafflement and dismay of
cultural critics, only after
factory work became dominant and cities swelled.
The 21st century, if it presents fewer full-time jobs in the
sectors that can be automated,
could in this respect come to resemble the mid-19th century: an
economy marked by
episodic work across a range of activities, the loss of any one of
which would not make
somebody suddenly idle. Many bristle that contingent gigs offer
a devil’s bargain—a bit of
additional autonomy in exchange for a larger loss of security.
But some might thrive in a
market where versatility and hustle are rewarded—where there
are, as in Youngstown, few
jobs to have, yet many things to do.
6. Government: The Visible Hand
In the 1950s, Henry Ford II, the CEO of Ford, and Walter
Reuther, the head of the United Auto
Workers union, were touring a new engine plant in Cleveland.
Ford gestured to a fleet of
machines and said, “Walter, how are you going to get these
robots to pay union dues?” The
union boss famously replied: “Henry, how are you going to get
them to buy your cars?”
As Martin Ford (no relation) writes in his new book, Rise of the Robots, this story might
be apocryphal, but its message is instructive. We’re pretty good
at noticing the immediate
effects of technology’s substituting for workers, such as fewer
people on the factory floor.
What’s harder is anticipating the second-order effects of this
transformation, such as what
happens to the consumer economy when you take away the
consumers.
Technological progress on the scale we’re imagining would
usher in social and cultural
changes that are almost impossible to fully envision. Consider
just how fundamentally work
has shaped America’s geography. Today’s coastal cities are a
jumble of office buildings and
residential space. Both are expensive and tightly constrained.
But the decline of work would
make many office buildings unnecessary. What might that mean
for the vibrancy of urban
areas? Would office space yield seamlessly to apartments,
allowing more people to live more
affordably in city centers and leaving the cities themselves just
as lively? Or would we see
vacant shells and spreading blight? Would big cities make sense
at all if their role as highly
sophisticated labor ecosystems were diminished? As the 40-hour
workweek faded, the idea
of a lengthy twice-daily commute would almost certainly strike
future generations as an
antiquated and baffling waste of time. But would those
generations prefer to live on streets
full of high-rises, or in smaller towns?
Today, many working parents worry that they spend too many
hours at the office. As full-time
work declined, rearing children could become less
overwhelming. And because job
opportunities historically have spurred migration in the United
States, we might see less of it;
the diaspora of extended families could give way to more
closely knitted clans. But if men
and women lost their purpose and dignity as work went away,
those families would
nonetheless be troubled.
The decline of the labor force would make our politics more
contentious. Deciding how to tax
profits and distribute income could become the most significant
economic-policy debate in
American history. In The Wealth of Nations, Adam Smith used
the term invisible hand to refer
to the order and social benefits that arise, surprisingly, from
individuals’ selfish actions. But to
preserve the consumer economy and the social fabric,
governments might have to embrace
what Haruhiko Kuroda, the governor of the Bank of Japan, has
called the visible hand of
economic intervention. What follows is an early sketch of how
it all might work.
In the near term, local governments might do well to create
more and more-ambitious
community centers or other public spaces where residents can
meet, learn skills, bond
around sports or crafts, and socialize. Two of the most common
side effects of
unemployment are loneliness, on the individual level, and the
hollowing-out of community
pride. A national policy that directed money toward centers in
distressed areas might remedy
the maladies of idleness, and form the beginnings of a long-term
experiment on how to
reengage people in their neighborhoods in the absence of full
employment.
We could also make it easier for people to start their own,
small-scale (and even part-time)
businesses. New-business formation has declined in the past few
decades in all 50 states.
One way to nurture fledgling ideas would be to build out a
network of business incubators.
Here Youngstown offers an unexpected model: its business
incubator has been recognized
internationally, and its success has brought new hope to West
Federal Street, the city’s main
drag.
Near the beginning of any broad decline in job availability, the
United States might take a
lesson from Germany on job-sharing. The German government
gives firms incentives to cut
all their workers’ hours rather than lay off some of them during
hard times. So a company
with 50 workers that might otherwise lay off 10 people instead
reduces everyone’s hours by
20 percent. Such a policy would help workers at established
firms keep their attachment to
the labor force despite the declining amount of overall labor.
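To make the arithmetic behind that German example concrete, here is a minimal sketch in Python; the figures (a 50-person firm, 40-hour weeks, a 20 percent drop in demand) are illustrative assumptions rather than data from the article, and the point is only that the two routes cut the same number of labor hours while spreading them very differently.

# Work-sharing arithmetic: same reduction in hours, very different head count.
# All numbers below are illustrative assumptions, not figures from the article.
WORKERS = 50
HOURS_PER_WEEK = 40
DEMAND_DROP = 0.20  # the firm needs 20 percent fewer labor hours

needed_hours = WORKERS * HOURS_PER_WEEK * (1 - DEMAND_DROP)

# Option A: layoffs -- keep full 40-hour weeks, shed whole jobs.
layoffs = WORKERS - int(needed_hours / HOURS_PER_WEEK)   # 10 people lose their jobs

# Option B: job-sharing -- everyone stays, everyone works a shorter week.
shared_week = needed_hours / WORKERS                      # 32 hours apiece

print(f"Hours the firm still needs: {needed_hours:.0f}")
print(f"Layoff route:      {layoffs} workers cut, {WORKERS - layoffs} keep 40-hour weeks")
print(f"Job-sharing route: 0 workers cut, all {WORKERS} work {shared_week:.0f}-hour weeks")

Either way the firm buys 1,600 hours of labor; the policy question is simply whether those hours are concentrated in 40 jobs or spread across 50.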
Spreading work in this way has its limits. Some jobs can’t be
easily shared, and in any case,
sharing jobs wouldn’t stop labor’s pie from shrinking: it would
only apportion the slices
differently. Eventually, Washington would have to somehow
spread wealth, too.
One way of doing that would be to more heavily tax the growing
share of income going to the
owners of capital, and use the money to cut checks to all adults.
This idea—called a
“universal basic income”—has received bipartisan support in
the past. Many liberals currently
support it, and in the 1960s, Richard Nixon and the conservative
economist Milton Friedman
each proposed a version of the idea. That history
notwithstanding, the politics of universal
income in a world without universal work would be daunting.
The rich could say, with some
accuracy, that their hard work was subsidizing the idleness of
millions of “takers.” What’s
more, although a universal income might replace lost wages, it
would do little to preserve the
social benefits of work.
The most direct solution to the latter problem would be for the
government to pay people to
do something, rather than nothing. Although this smacks of old
European socialism, or
Depression-era “makework,” it might do the most to preserve
virtues such as responsibility,
agency, and industriousness. In the 1930s, the Works Progress
Administration did more than
rebuild the nation’s infrastructure. It hired 40,000 artists and
other cultural workers to produce
music and theater, murals and paintings, state and regional
travel guides, and surveys of
state records. It’s not impossible to imagine something like the
WPA—or an effort even more
capacious—for a post-work future.
What might that look like? Several national projects might
justify direct hiring, such as caring
for a rising population of elderly people. But if the balance of
work continues to shift toward
the small-bore and episodic, the simplest way to help everybody
stay busy might be
government sponsorship of a national online marketplace of
work (or, alternatively, a series of
local ones, sponsored by local governments). Individuals could
browse for large long-term
projects, like cleaning up after a natural disaster, or small short-
term ones: an hour of
tutoring, an evening of entertainment, an art commission. The
requests could come from
local governments or community associations or nonprofit
groups; from rich families seeking
nannies or tutors; or from other individuals given some number
of credits to “spend” on the
site each year. To ensure a baseline level of attachment to the
workforce, the government
could pay adults a flat rate in return for some minimum level of
activity on the site, but people
could always earn more by taking on more gigs.
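As a thought experiment only, the workflow described in that paragraph can be sketched as a tiny data model. Everything below (the Task and Ledger names, the credit allowance, the 100-hour activity minimum, the flat $5,000 stipend) is a hypothetical illustration invented for this sketch, not anything proposed in the article or implemented anywhere.

# A hypothetical sketch of the "digital WPA" marketplace described above.
# Every name and number here is an illustrative assumption.
from dataclasses import dataclass, field

@dataclass
class Task:
    poster: str          # local government, nonprofit, family, or individual
    description: str
    hours: float
    credits: int         # what the poster "spends" from an annual allowance

@dataclass
class Ledger:
    hours_worked: dict = field(default_factory=dict)   # worker -> hours this year
    MIN_HOURS = 100       # assumed minimum activity needed for the flat stipend
    STIPEND = 5000        # assumed flat annual payment, in dollars

    def record(self, worker: str, task: Task) -> None:
        """Credit a completed task toward the worker's annual activity."""
        self.hours_worked[worker] = self.hours_worked.get(worker, 0) + task.hours

    def stipend_due(self, worker: str) -> int:
        """Pay the flat rate only once the minimum activity level is met."""
        return self.STIPEND if self.hours_worked.get(worker, 0) >= self.MIN_HOURS else 0

ledger = Ledger()
ledger.record("alex", Task("school district", "an evening of tutoring", hours=3, credits=2))
print(ledger.stipend_due("alex"))   # 0 -- not yet at the minimum activity level

The design choice the paragraph implies is visible even in this toy version: the stipend rewards a baseline of participation rather than any particular job, and additional gigs simply add to the ledger.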
Although a digital WPA might strike some people as a strange
anachronism, it would be
similar to a federalized version of Mechanical Turk, the popular
Amazon sister site where
individuals and companies post projects of varying complexity,
while so-called Turks on the
other end browse tasks and collect money for the ones they
complete. Mechanical Turk was
designed to list tasks that cannot be performed by a computer.
(The name is an allusion to an
18th-century Austrian hoax, in which a famous automaton that
seemed to play masterful
chess concealed a human player who chose the moves and
moved the pieces.)
A government marketplace might likewise specialize in those
tasks that required empathy,
humanity, or a personal touch. By connecting millions of people
in one central hub, it might
even inspire what the technology writer Robin Sloan has called
“a Cambrian explosion of
mega-scale creative and intellectual pursuits, a generation of
Wikipedia-scale projects that
can ask their users for even deeper commitments.”
There’s a case to be made for using the tools of government to
provide other incentives as
well, to help people avoid the typical traps of joblessness and
build rich lives and vibrant
communities. After all, the members of the Columbus Idea
Foundry probably weren’t born
with an innate love of lathe operation or laser-cutting.
Mastering these skills requires
discipline; discipline requires an education; and an education,
for many people, involves the
expectation that hours of often frustrating practice will
eventually prove rewarding. In a post-
work society, the financial rewards of education and training
won’t be as obvious. This is a
singular challenge of imagining a flourishing post-work society:
How will people discover their
talents, or the rewards that come from expertise, if they don’t
see much incentive to develop
either?
Modest payments to young people for attending and completing
college, skills-training
programs, or community-center workshops might eventually be
worth considering. This
seems radical, but the aim would be conservative—to preserve
the status quo of an
educated and engaged society. Whatever their career
opportunities, young people will still
grow up to be citizens, neighbors, and even, episodically,
workers. Nudges toward education
and training might be particularly beneficial to men, who are
more likely to withdraw into their
living rooms when they become unemployed.
7. Jobs and Callings
Decades from now, perhaps the 20th century will strike future
historians as an aberration,
with its religious devotion to overwork in a time of prosperity,
its attenuations of family in
service to job opportunity, its conflation of income with self-
worth. The post-work society I’ve
described holds a warped mirror up to today’s economy, but in
many ways it reflects the
forgotten norms of the mid-19th century—the artisan middle
class, the primacy of local
communities, and the unfamiliarity with widespread joblessness.
The three potential futures of consumption, communal
creativity, and contingency are not
separate paths branching out from the present. They’re likely to
intertwine and even influence
one another. Entertainment will surely become more immersive
and exert a gravitational pull
on people without much to do. But if that’s all that happens,
society will have failed. The
foundry in Columbus shows how the “third places” in people’s
lives (communities separate
from their homes and offices) could become central to growing
up, learning new skills,
discovering passions. And with or without such places, many
people will need to embrace
the resourcefulness learned over time by cities like
Youngstown, which, even if they seem like
museum exhibits of an old economy, might foretell the future
for many more cities in the next
25 years.
On my last day in Youngstown, I met with Howard Jesko, a 60-
year-old Youngstown State
graduate student, at a burger joint along the main street. A few
months after Black Friday in
1977, as a senior at Ohio State University, Jesko received a
phone call from his father, a
specialty-hose manufacturer near Youngstown. “Don’t bother
coming back here for a job,”
his dad said. “There aren’t going to be any left.” Years later,
Jesko returned to Youngstown to
work, but he recently quit his job selling products like
waterproofing systems to construction
companies; his customers had been devastated by the Great
Recession and weren’t buying
much anymore. Around the same time, a left-knee replacement
due to degenerative arthritis
resulted in a 10-day hospital stay, which gave him time to think
about the future. Jesko
decided to go back to school to become a professor. “My true
calling,” he told me, “has
always been to teach.”
One theory of work holds that people tend to see themselves in
jobs, careers, or callings.
Individuals who say their work is “just a job” emphasize that
they are working for money
rather than aligning themselves with any higher purpose. Those
with pure careerist ambitions
are focused not only on income but also on the status that comes
with promotions and the
growing renown of their peers. But one pursues a calling not
only for pay or status, but also
for the intrinsic fulfillment of the work itself.
When I think about the role that work plays in people’s self-
esteem—particularly in America—
the prospect of a no-work future seems hopeless. There is no
universal basic income that can
prevent the civic ruin of a country built on a handful of workers
permanently subsidizing the
idleness of tens of millions of people. But a future of less work
still holds a glint of hope,
because the necessity of salaried jobs now prevents so many
from seeking immersive
activities that they enjoy.
After my conversation with Jesko, I walked back to my car to
drive out of Youngstown. I
thought about Jesko’s life as it might have been had
Youngstown’s steel mills never given
way to a steel museum—had the city continued to provide
stable, predictable careers to its
residents. If Jesko had taken a job in the steel industry, he
might be preparing for retirement
today. Instead, that industry collapsed and then, years later,
another recession struck. The
outcome of this cumulative grief is that Howard Jesko is not
retiring at 60. He’s getting his
master’s degree to become a teacher. It took the loss of so many
jobs to force him to pursue
the work he always wanted to do.
Is Google Making Us Stupid?
What the Internet is doing to our brains
Illustration by Guy Billout
1. "Dave, stop. Stop, will you? Stop, Dave. Will you stop,
Dave?” So the supercomputer HAL pleads with
the implacable astronaut Dave Bowman in a famous and weirdly
poignant scene toward the end of
Stanley Kubrick’s 2001: A Space Odyssey. Bowman, having
nearly been sent to a deep-space death by
the malfunctioning machine, is calmly, coldly disconnecting the
memory circuits that control its
artificial “brain.” “Dave, my mind is going,” HAL says,
forlornly. “I can feel it. I can feel it.”
2. I can feel it, too. Over the past few years I’ve had an
uncomfortable sense that someone, or something,
has been tinkering with my brain, remapping the neural
circuitry, reprogramming the memory. My mind
isn’t going—so far as I can tell—but it’s changing. I’m not
thinking the way I used to think. I can feel it
most strongly when I’m reading. Immersing myself in a book or
a lengthy article used to be easy. My
mind would get caught up in the narrative or the turns of the
argument, and I’d spend hours strolling
through long stretches of prose. That’s rarely the case anymore.
Now my concentration often starts to
drift after two or three pages. I get fidgety, lose the thread,
begin looking for something else to do. I feel
as if I’m always dragging my wayward brain back to the text.
The deep reading that used to come
naturally has become a struggle.
3. I think I know what’s going on. For more than a decade now,
I’ve been spending a lot of time online,
searching and surfing and sometimes adding to the great
databases of the Internet. The Web has been a
godsend to me as a writer. Research that once required days in
the stacks or periodical rooms of libraries
can now be done in minutes. A few Google searches, some
quick clicks on hyperlinks, and I’ve got the
telltale fact or pithy quote I was after. Even when I’m not
working, I’m as likely as not to be foraging in
the Web’s info-thickets: reading and writing e-mails, scanning
headlines and blog posts, watching videos
and listening to podcasts, or just tripping from link to link to
link. (Unlike footnotes, to which they’re
sometimes likened, hyperlinks don’t merely point to related
works; they propel you toward them.)
4. For me, as for others, the Net is becoming a universal
medium, the conduit for most of the information
that flows through my eyes and ears and into my mind. The
advantages of having immediate access to
such an incredibly rich store of information are many, and
they’ve been widely described and duly
applauded. “The perfect recall of silicon memory,” Wired’s
Clive Thompson has written, “can be an
enormous boon to thinking.” But that boon comes at a price. As
the media theorist Marshall McLuhan
pointed out in the 1960s, media are not just passive channels of
information. They supply the stuff of
thought, but they also shape the process of thought. And what
the Net seems to be doing is chipping
away my capacity for concentration and contemplation. My
mind now expects to take in information the
way the Net distributes it: in a swiftly moving stream of
particles. Once I was a scuba diver in the sea of
words. Now I zip along the surface like a guy on a Jet Ski.
5. I’m not the only one. When I mention my troubles with
reading to friends and acquaintances—literary
types, most of them—many say they’re having similar
experiences. The more they use the Web, the
more they have to fight to stay focused on long pieces of
writing. Some of the bloggers I follow have
also begun mentioning the phenomenon. Scott Karp, who writes
a blog about online media, recently
confessed that he has stopped reading books altogether. “I was a
lit major in college, and used to be [a]
voracious book reader,” he wrote. “What happened?” He
speculates on the answer: “What if I do all my
reading on the web not so much because the way I read has
changed, i.e. I’m just seeking convenience,
but because the way I THINK has changed?”
6. Bruce Friedman, who blogs regularly about the use of
computers in medicine, also has described how
the Internet has altered his mental habits. “I now have almost
totally lost the ability to read and
absorb a longish article on the web or in print,” he wrote earlier
this year. A pathologist who has long
been on the faculty of the University of Michigan Medical
School, Friedman elaborated on his comment
in a telephone conversation with me. His thinking, he said, has
taken on a “staccato” quality,
reflecting the way he quickly scans short passages of text from
many sources online. “I can’t read War
and Peace anymore,” he admitted. “I’ve lost the ability to do
that. Even a blog post of more than three
or four paragraphs is too much to absorb. I skim it.”
7. Anecdotes alone don’t prove much. And we still await the
long-term neurological and psychological
experiments that will provide a definitive picture of how
Internet use affects cognition. But a recently
published study of online research habits, conducted by
scholars from University College London,
suggests that we may well be in the midst of a sea change in the
way we read and think. As part of
the five-year research program, the scholars examined computer
logs documenting the behavior of
visitors to two popular research sites, one operated by the
British Library and one by a U.K. educational
consortium, that provide access to journal articles, e-books, and
other sources of written information.
They found that people using the sites exhibited “a form of
skimming activity,” hopping from one
source to another and rarely returning to any source they’d
already visited. They typically read no
more than one or two pages of an article or book before they
would “bounce” out to another site.
Sometimes they’d save a long article, but there’s no evidence
that they ever went back and actually read
it. The authors of the study report:
It is clear that users are not reading online in the traditional
sense; indeed there are signs that
new forms of “reading” are emerging as users “power browse”
horizontally through titles,
contents pages and abstracts going for quick wins. It almost
seems that they go online to avoid
reading in the traditional sense.
8. Thanks to the ubiquity of text on the Internet, not to mention
the popularity of text-messaging on cell
phones, we may well be reading more today than we did in the
1970s or 1980s, when television was our
medium of choice. But it’s a different kind of reading, and
behind it lies a different kind of
thinking—perhaps even a new sense of the self. “We are not
only what we read,” says Maryanne Wolf,
a developmental psychologist at Tufts University and the author
of Proust and the Squid: The Story and
Science of the Reading Brain. “We are how we read.” Wolf
worries that the style of reading
promoted by the Net, a style that puts “efficiency” and
“immediacy” above all else, may be
weakening our capacity for the kind of deep reading that
emerged when an earlier technology, the
printing press, made long and complex works of prose
commonplace. When we read online, she says,
we tend to become “mere decoders of information.” Our ability
to interpret text, to make the rich mental
connections that form when we read deeply and without
distraction, remains largely disengaged.
9. Reading, explains Wolf, is not an instinctive skill for human
beings. It’s not etched into our genes the
way speech is. We have to teach our minds how to translate the
symbolic characters we see into the
language we understand. And the media or other technologies
we use in learning and practicing the craft
of reading play an important part in shaping the neural circuits
inside our brains. Experiments
demonstrate that readers of ideograms, such as the Chinese,
develop a mental circuitry for reading that is
very different from the circuitry found in those of us whose
written language employs an alphabet. The
variations extend across many regions of the brain, including
those that govern such essential cognitive
functions as memory and the interpretation of visual and
auditory stimuli. We can expect as well that
the circuits woven by our use of the Net will be different from
those woven by our reading of
books and other printed works.
10. Sometime in 1882, Friedrich Nietzsche bought a
typewriter—a Malling-Hansen Writing Ball, to be
precise. His vision was failing, and keeping his eyes focused on
a page had become exhausting and
painful, often bringing on crushing headaches. He had been
forced to curtail his writing, and he feared
that he would soon have to give it up. The typewriter rescued
him, at least for a time. Once he had
mastered touch-typing, he was able to write with his eyes
closed, using only the tips of his fingers.
Words could once again flow from his mind to the page.
11. But the machine had a subtler effect on his work. One of
Nietzsche’s friends, a composer, noticed a
change in the style of his writing. His already terse prose had
become even tighter, more telegraphic.
“Perhaps you will through this instrument even take to a new
idiom,” the friend wrote in a letter, noting
that, in his own work, his “‘thoughts’ in music and language
often depend on the quality of pen and
paper.”
12. “You are right,” Nietzsche replied, “our writing equipment
takes part in the forming of our
thoughts.” Under the sway of the machine, writes the German
media scholar Friedrich A. Kittler,
Nietzsche’s prose “changed from arguments to aphorisms, from
thoughts to puns, from rhetoric to
telegram style.”
13. The human brain is almost infinitely malleable. People used
to think that our mental meshwork, the
dense connections formed among the 100 billion or so neurons
inside our skulls, was largely fixed by
the time we reached adulthood. But brain researchers have
discovered that that’s not the case.
James Olds, a professor of neuroscience who directs the
Krasnow Institute for Advanced Study at
George Mason University, says that even the adult mind “is
very plastic.” Nerve cells routinely break
old connections and form new ones. “The brain,” according to
Olds, “has the ability to reprogram itself
on the fly, altering the way it functions.”
14. As we use what the sociologist Daniel Bell has called our
“intellectual technologies”—the tools that
extend our mental rather than our physical capacities—we
inevitably begin to take on the qualities of
those technologies. The mechanical clock, which came into
common use in the 14th century, provides a
compelling example. In Technics and Civilization, the historian
and cultural critic Lewis
Mumford described how the clock “disassociated time from
human events and helped create the belief
in an independent world of mathematically measurable
sequences.” The “abstract framework of divided
time” became “the point of reference for both action and
thought.”
15. The clock’s methodical ticking helped bring into being the
scientific mind and the scientific man. But it
also took something away. As the late MIT computer scientist
Joseph Weizenbaum observed in his
1976 book, Computer Power and Human Reason: From
Judgment to Calculation, the conception of the
world that emerged from the widespread use of timekeeping
instruments “remains an impoverished
version of the older one, for it rests on a rejection of those
direct experiences that formed the basis for,
and indeed constituted, the old reality.” In deciding when to eat,
to work, to sleep, to rise, we stopped
listening to our senses and started obeying the clock.
16. The process of adapting to new intellectual technologies is
reflected in the changing metaphors we use
to explain ourselves to ourselves. When the mechanical clock
arrived, people began thinking of their
brains as operating “like clockwork.” Today, in the age of
software, we have come to think of them as
operating “like computers.” But the changes, neuroscience tells
us, go much deeper than metaphor.
Thanks to our brain’s plasticity, the adaptation occurs also at a
biological level.
17. The Internet promises to have particularly far-reaching
effects on cognition. In a paper published in
1936, the British mathematician Alan Turing proved that a
digital computer, which at the time existed
only as a theoretical machine, could be programmed to perform
the function of any other information-
processing device. And that’s what we’re seeing today. The
Internet, an immeasurably powerful
computing system, is subsuming most of our other intellectual
technologies. It’s becoming our map and
our clock, our printing press and our typewriter, our calculator
and our telephone, and our radio and TV.
18. When the Net absorbs a medium, that medium is re-created
in the Net’s image. It injects the medium’s
content with hyperlinks, blinking ads, and other digital
gewgaws, and it surrounds the content with the
content of all the other media it has absorbed. A new e-mail
message, for instance, may announce its
arrival as we’re glancing over the latest headlines at a
newspaper’s site. The result is to scatter our
attention and diffuse our concentration.
19. The Net’s influence doesn’t end at the edges of a computer
screen, either. As people’s minds become
attuned to the crazy quilt of Internet media, traditional media
have to adapt to the audience’s new
expectations. Television programs add text crawls and pop-up
ads, and magazines and newspapers
shorten their articles, introduce capsule summaries, and crowd
their pages with easy-to-browse info-
snippets. When, in March of this year, The New York Times
decided to devote the second and third pages
of every edition to article abstracts , its design director, Tom
Bodkin, explained that the “shortcuts”
would give harried readers a quick “taste” of the day’s news,
sparing them the “less efficient” method of
actually turning the pages and reading the articles. Old media
have little choice but to play by the new-
media rules.
20. Never has a communications system played so many roles in
our lives—or exerted such broad influence
over our thoughts—as the Internet does today. Yet, for all that’s
been written about the Net, there’s been
little consideration of how, exactly, it’s reprogramming us. The
Net’s intellectual ethic remains obscure.
21. About the same time that Nietzsche started using his
typewriter, an earnest young man named Frederick
Winslow Taylor carried a stopwatch into the Midvale Steel
plant in Philadelphia and began a historic
series of experiments aimed at improving the efficiency of the
plant’s machinists. With the approval of
Midvale’s owners, he recruited a group of factory hands, set
them to work on various metalworking
machines, and recorded and timed their every movement as well
as the operations of the machines. By
breaking down every job into a sequence of small, discrete steps
and then testing different ways of
performing each one, Taylor created a set of precise
instructions—an “algorithm,” we might say today—
for how each worker should work. Midvale’s employees
grumbled about the strict new regime, claiming
that it turned them into little more than automatons, but the
factory’s productivity soared.
22. More than a hundred years after the invention of the steam
engine, the Industrial Revolution had at last
found its philosophy and its philosopher. Taylor’s tight
industrial choreography—his “system,” as he
liked to call it—was embraced by manufacturers throughout the
country and, in time, around the world.
Seeking maximum speed, maximum efficiency, and maximum
output, factory owners used time-and-
motion studies to organize their work and configure the jobs of
their workers. The goal, as Taylor
defined it in his celebrated 1911 treatise, The Principles of
Scientific Management, was to identify and
adopt, for every job, the “one best method” of work and thereby
to effect “the gradual substitution of
science for rule of thumb throughout the mechanic arts.” Once
his system was applied to all acts of
manual labor, Taylor assured his followers, it would bring about
a restructuring not only of industry but
of society, creating a utopia of perfect efficiency. “In the past
the man has been first,” he declared; “in
the future the system must be first.”
23. Taylor’s system is still very much with us; it remains the
ethic of industrial manufacturing. And now,
thanks to the growing power that computer engineers and
software coders wield over our intellectual
lives, Taylor’s ethic is beginning to govern the realm of the
mind as well. The Internet is a machine
designed for the efficient and automated collection,
transmission, and manipulation of information, and
its legions of programmers are intent on finding the “one best
method”—the perfect algorithm—to carry
out every mental movement of what we’ve come to describe as
“knowledge work.”
24. Google’s headquarters, in Mountain View, California—the
Googleplex—is the Internet’s high church,
and the religion practiced inside its walls is Taylorism. Google,
says its chief executive, Eric Schmidt, is
“a company that’s founded around the science of measurement,”
and it is striving to “systematize
everything” it does. Drawing on the terabytes of behavioral data
it collects through its search engine and
other sites, it carries out thousands of experiments a day,
according to the Harvard Business Review, and
it uses the results to refine the algorithms that increasingly
control how people find information and
extract meaning from it. What Taylor did for the work of the
hand, Google is doing for the work of the
mind.
25. The company has declared that its mission is “to organize
the world’s information and make it
universally accessible and useful.” It seeks to develop “the
perfect search engine,” which it defines as
something that “understands exactly what you mean and gives
you back exactly what you want.” In
Google’s view, information is a kind of commodity, a utilitarian
resource that can be mined and
processed with industrial efficiency. The more pieces of
information we can “access” and the faster we
can extract their gist, the more productive we become as
thinkers.
26. Where does it end? Sergey Brin and Larry Page, the gifted
young men who founded Google while
pursuing doctoral degrees in computer science at Stanford,
speak frequently of their desire to turn their
search engine into an artificial intelligence, a HAL-like machine
that might be connected directly to our
brains. “The ultimate search engine is something as smart as
people—or smarter,” Page said in a speech
a few years back. “For us, working on search is a way to work
on artificial intelligence.” In a 2004
interview with Newsweek, Brin said, “Certainly if you had all
the world’s information directly attached
to your brain, or an artificial brain that was smarter than your
brain, you’d be better off.” Last year, Page
told a convention of scientists that Google is “really trying to
build artificial intelligence and to do it on
a large scale.”
27. Such an ambition is a natural one, even an admirable one,
for a pair of math whizzes with vast quantities
of cash at their disposal and a small army of computer scientists
in their employ. A fundamentally
scientific enterprise, Google is motivated by a desire to use
technology, in Eric Schmidt’s words, “to
solve problems that have never been solved before,” and
artificial intelligence is the hardest problem out
there. Why wouldn’t Brin and Page want to be the ones to crack
it?
28. Still, their easy assumption that we’d all “be better off” if
our brains were supplemented, or even
replaced, by an artificial intelligence is unsettling. It suggests a
belief that intelligence is the output of a
mechanical process, a series of discrete steps that can be
isolated, measured, and optimized. In Google’s
world, the world we enter when we go online, there’s little
place for the fuzziness of contemplation.
Ambiguity is not an opening for insight but a bug to be fixed.
The human brain is just an outdated
computer that needs a faster processor and a bigger hard drive.
29. The idea that our minds should operate as high-speed data-
processing machines is not only built into the
workings of the Internet, it is the network’s reigning business
model as well. The faster we surf across
the Web—the more links we click and pages we view—the more
opportunities Google and other
companies gain to collect information about us and to feed us
advertisements. Most of the proprietors of
the commercial Internet have a financial stake in collecting the
crumbs of data we leave behind as we flit
from link to link—the more crumbs, the better. The last thing
these companies want is to encourage
leisurely reading or slow, concentrated thought. It’s in their
economic interest to drive us to distraction.
30. Maybe I’m just a worrywart. Just as there’s a tendency to
glorify technological progress, there’s a
countertendency to expect the worst of every new tool or
machine. In Plato’s Phaedrus, Socrates
bemoaned the development of writing. He feared that, as people
came to rely on the written word as a
substitute for the knowledge they used to carry inside their
heads, they would, in the words of one of the
dialogue’s characters, “cease to exercise their memory and
become forgetful.” And because they would
be able to “receive a quantity of information without proper
instruction,” they would “be thought very
knowledgeable when they are for the most part quite ignorant.”
They would be “filled with the conceit
of wisdom instead of real wisdom.” Socrates wasn’t wrong—the
new technology did often have the
effects he feared—but he was shortsighted. He couldn’t foresee
the many ways that writing and reading
would serve to spread information, spur fresh ideas, and expand
human knowledge (if not wisdom).
31. The arrival of Gutenberg’s printing press, in the 15th
century, set off another round of teeth gnashing.
The Italian humanist Hieronimo Squarciafico worried that the
easy availability of books would lead to
intellectual laziness, making men “less studious” and weakening
their minds. Others argued that cheaply
printed books and broadsheets would undermine religious
authority, demean the work of scholars and
scribes, and spread sedition and debauchery. As New York
University professor Clay Shirky notes,
“Most of the arguments made against the printing press were
correct, even prescient.” But, again, the
doomsayers were unable to imagine the myriad blessings that
the printed word would deliver.
32. So, yes, you should be skeptical of my skepticism. Perhaps
those who dismiss critics of the Internet as
Luddites or nostalgists will be proved correct, and from our
hyperactive, data-stoked minds will spring a
golden age of intellectual discovery and universal wisdom. Then
again, the Net isn’t the alphabet, and
although it may replace the printing press, it produces
something altogether different. The kind of deep
reading that a sequence of printed pages promotes is valuable
not just for the knowledge we acquire
from the author’s words but for the intellectual vibrations those
words set off within our own minds. In
the quiet spaces opened up by the sustained, undistracted
reading of a book, or by any other act of
contemplation, for that matter, we make our own associations,
draw our own inferences and analogies,
foster our own ideas. Deep reading, as Maryanne Wolf argues,
is indistinguishable from deep thinking.
33. If we lose those quiet spaces, or fill them up with “content,”
we will sacrifice something important not
only in our selves but in our culture. In a recent essay, the
playwright Richard Foreman eloquently
described what’s at stake:
I come from a tradition of Western culture, in which the ideal
(my ideal) was the complex, dense
and “cathedral-like” structure of the highly educated and
articulate personality—a man or
woman who carried inside themselves a personally constructed
and unique version of the entire
heritage of the West. [But now] I see within us all (myself
included) the replacement of complex
inner density with a new kind of self—evolving under the
pressure of information overload and
the technology of the “instantly available.”
34. As we are drained of our “inner repertory of dense cultural
inheritance,” Foreman concluded, we risk
turning into “‘pancake people’—spread wide and thin as we
connect with that vast network of
information accessed by the mere touch of a button.”
35. I’m haunted by that scene in 2001. What makes it so
poignant, and so weird, is the computer’s
emotional response to the disassembly of its mind: its despair as
one circuit after another goes dark, its
childlike pleading with the astronaut—“I can feel it. I can feel
it. I’m afraid”—and its final reversion to
what can only be called a state of innocence. HAL’s outpouring
of feeling contrasts with the
emotionlessness that characterizes the human figures in the
film, who go about their business with an
almost robotic efficiency. Their thoughts and actions feel
scripted, as if they’re following the steps of an
algorithm. In the world of 2001, people have become so
machinelike that the most human character
turns out to be a machine. That’s the essence of Kubrick’s dark
prophecy: as we come to rely on
computers to mediate our understanding of the world, it is our
own intelligence that flattens into
artificial intelligence.
Assignment 3
Evaluate the arguments of the following articles (in
GD>Resources>Readings>Portfolio 2 readings):
· “The Internet: Is It Changing the Way we Think?”
https://www.theguardian.com/technology/2010/aug/15/internet-brain-neuroscience-debate
How Technology Is Changing Our Choices and the ValuesThat He.docx
How Technology Is Changing Our Choices and the ValuesThat He.docx
How Technology Is Changing Our Choices and the ValuesThat He.docx
How Technology Is Changing Our Choices and the ValuesThat He.docx
How Technology Is Changing Our Choices and the ValuesThat He.docx
How Technology Is Changing Our Choices and the ValuesThat He.docx

More Related Content

Similar to How Technology Is Changing Our Choices and the ValuesThat He.docx

Kim Solez tech&future of medicine for med students fall 2017
Kim Solez tech&future of medicine for med students fall 2017Kim Solez tech&future of medicine for med students fall 2017
Kim Solez tech&future of medicine for med students fall 2017Kim Solez ,
 
College Essay Review Services Do You Need The
College Essay Review Services Do You Need TheCollege Essay Review Services Do You Need The
College Essay Review Services Do You Need TheCarli Ferrante
 
Creon Tragic Hero In Antigone Essay
Creon Tragic Hero In Antigone EssayCreon Tragic Hero In Antigone Essay
Creon Tragic Hero In Antigone EssayAna Hall
 
Globalization Essay Introduction
Globalization Essay IntroductionGlobalization Essay Introduction
Globalization Essay IntroductionJulie Songy
 
Robots a threat_or_an_opportunity_m2_deipm
Robots a threat_or_an_opportunity_m2_deipmRobots a threat_or_an_opportunity_m2_deipm
Robots a threat_or_an_opportunity_m2_deipmAnastasia Romanski
 
London Poem Analysis- GCSE Study Flashcards,
London Poem Analysis- GCSE Study Flashcards,London Poem Analysis- GCSE Study Flashcards,
London Poem Analysis- GCSE Study Flashcards,Cassie Romero
 
1 An Introduction to Data Ethics MODULE AUTHOR1 S
1 An Introduction to Data Ethics MODULE AUTHOR1   S1 An Introduction to Data Ethics MODULE AUTHOR1   S
1 An Introduction to Data Ethics MODULE AUTHOR1 SMerrileeDelvalle969
 
IntroToCybersecurityEthics.pdf
IntroToCybersecurityEthics.pdfIntroToCybersecurityEthics.pdf
IntroToCybersecurityEthics.pdfJevieApatan
 
Kindergarten Writing Lesson Plans - Lesson Plans Lear
Kindergarten Writing Lesson Plans - Lesson Plans LearKindergarten Writing Lesson Plans - Lesson Plans Lear
Kindergarten Writing Lesson Plans - Lesson Plans LearLaura Johnson
 
Sample Compare And Contrast Essay Block Style
Sample Compare And Contrast Essay Block StyleSample Compare And Contrast Essay Block Style
Sample Compare And Contrast Essay Block StyleJennifer Reese
 
The value of being human - finding balance between the artificial and nature ...
The value of being human - finding balance between the artificial and nature ...The value of being human - finding balance between the artificial and nature ...
The value of being human - finding balance between the artificial and nature ...Salema Veliu
 
Handwriting Paper Printable - Ideas 2022
Handwriting Paper Printable - Ideas 2022Handwriting Paper Printable - Ideas 2022
Handwriting Paper Printable - Ideas 2022Jessica Valentin
 
Change Management - History and Future
Change Management - History and FutureChange Management - History and Future
Change Management - History and FutureHolger Nauheimer
 

Similar to How Technology Is Changing Our Choices and the ValuesThat He.docx (14)

Kim Solez tech&future of medicine for med students fall 2017
Kim Solez tech&future of medicine for med students fall 2017Kim Solez tech&future of medicine for med students fall 2017
Kim Solez tech&future of medicine for med students fall 2017
 
College Essay Review Services Do You Need The
College Essay Review Services Do You Need TheCollege Essay Review Services Do You Need The
College Essay Review Services Do You Need The
 
Creon Tragic Hero In Antigone Essay
Creon Tragic Hero In Antigone EssayCreon Tragic Hero In Antigone Essay
Creon Tragic Hero In Antigone Essay
 
Globalization Essay Introduction
Globalization Essay IntroductionGlobalization Essay Introduction
Globalization Essay Introduction
 
Robots a threat_or_an_opportunity_m2_deipm
Robots a threat_or_an_opportunity_m2_deipmRobots a threat_or_an_opportunity_m2_deipm
Robots a threat_or_an_opportunity_m2_deipm
 
London Poem Analysis- GCSE Study Flashcards,
London Poem Analysis- GCSE Study Flashcards,London Poem Analysis- GCSE Study Flashcards,
London Poem Analysis- GCSE Study Flashcards,
 
1 An Introduction to Data Ethics MODULE AUTHOR1 S
1 An Introduction to Data Ethics MODULE AUTHOR1   S1 An Introduction to Data Ethics MODULE AUTHOR1   S
1 An Introduction to Data Ethics MODULE AUTHOR1 S
 
IntroToCybersecurityEthics.pdf
IntroToCybersecurityEthics.pdfIntroToCybersecurityEthics.pdf
IntroToCybersecurityEthics.pdf
 
Kindergarten Writing Lesson Plans - Lesson Plans Lear
Kindergarten Writing Lesson Plans - Lesson Plans LearKindergarten Writing Lesson Plans - Lesson Plans Lear
Kindergarten Writing Lesson Plans - Lesson Plans Lear
 
Sample Compare And Contrast Essay Block Style
Sample Compare And Contrast Essay Block StyleSample Compare And Contrast Essay Block Style
Sample Compare And Contrast Essay Block Style
 
The value of being human - finding balance between the artificial and nature ...
The value of being human - finding balance between the artificial and nature ...The value of being human - finding balance between the artificial and nature ...
The value of being human - finding balance between the artificial and nature ...
 
Handwriting Paper Printable - Ideas 2022
Handwriting Paper Printable - Ideas 2022Handwriting Paper Printable - Ideas 2022
Handwriting Paper Printable - Ideas 2022
 
Change Management - History and Future
Change Management - History and FutureChange Management - History and Future
Change Management - History and Future
 
Freelance Essay Writing
Freelance Essay WritingFreelance Essay Writing
Freelance Essay Writing
 

More from wellesleyterresa

Hw059f6dbf-250a-4d74-8f5e-f28f14227edc.jpg__MACOSXHw._059.docx
Hw059f6dbf-250a-4d74-8f5e-f28f14227edc.jpg__MACOSXHw._059.docxHw059f6dbf-250a-4d74-8f5e-f28f14227edc.jpg__MACOSXHw._059.docx
Hw059f6dbf-250a-4d74-8f5e-f28f14227edc.jpg__MACOSXHw._059.docxwellesleyterresa
 
HW in teams of 3 studentsAn oil remanufacturing company uses c.docx
HW in teams of 3 studentsAn oil remanufacturing company uses c.docxHW in teams of 3 studentsAn oil remanufacturing company uses c.docx
HW in teams of 3 studentsAn oil remanufacturing company uses c.docxwellesleyterresa
 
HW 5.docxAssignment 5 – Currency riskYou may do this assig.docx
HW 5.docxAssignment 5 – Currency riskYou may do this assig.docxHW 5.docxAssignment 5 – Currency riskYou may do this assig.docx
HW 5.docxAssignment 5 – Currency riskYou may do this assig.docxwellesleyterresa
 
HW#3 – Spring 20181. Giulia is traveling from Italy to China. .docx
HW#3 – Spring 20181. Giulia is traveling from Italy to China. .docxHW#3 – Spring 20181. Giulia is traveling from Italy to China. .docx
HW#3 – Spring 20181. Giulia is traveling from Italy to China. .docxwellesleyterresa
 
HW 2Due July 1 by 500 PM.docx
HW 2Due July 1 by 500 PM.docxHW 2Due July 1 by 500 PM.docx
HW 2Due July 1 by 500 PM.docxwellesleyterresa
 
HW 4 Gung Ho Commentary DUE Thursday, April 20 at 505 PM on.docx
HW 4 Gung Ho Commentary DUE Thursday, April 20 at 505 PM on.docxHW 4 Gung Ho Commentary DUE Thursday, April 20 at 505 PM on.docx
HW 4 Gung Ho Commentary DUE Thursday, April 20 at 505 PM on.docxwellesleyterresa
 
HW 5 Math 405. Due beginning of class – Monday, 10 Oct 2016.docx
HW 5 Math 405. Due beginning of class – Monday, 10 Oct 2016.docxHW 5 Math 405. Due beginning of class – Monday, 10 Oct 2016.docx
HW 5 Math 405. Due beginning of class – Monday, 10 Oct 2016.docxwellesleyterresa
 
HW 5-RSAascii2str.mfunction str = ascii2str(ascii) .docx
HW 5-RSAascii2str.mfunction str = ascii2str(ascii)        .docxHW 5-RSAascii2str.mfunction str = ascii2str(ascii)        .docx
HW 5-RSAascii2str.mfunction str = ascii2str(ascii) .docxwellesleyterresa
 
HW 3 Project Control• Status meeting agenda – shows time, date .docx
HW 3 Project Control• Status meeting agenda – shows time, date .docxHW 3 Project Control• Status meeting agenda – shows time, date .docx
HW 3 Project Control• Status meeting agenda – shows time, date .docxwellesleyterresa
 
HW 1January 19 2017Due back Jan 26, in class.1. (T.docx
HW 1January 19 2017Due back Jan 26, in class.1. (T.docxHW 1January 19 2017Due back Jan 26, in class.1. (T.docx
HW 1January 19 2017Due back Jan 26, in class.1. (T.docxwellesleyterresa
 
Hussam Malibari Heckman MAT 242 Spring 2017Assignment Chapte.docx
Hussam Malibari Heckman MAT 242 Spring 2017Assignment Chapte.docxHussam Malibari Heckman MAT 242 Spring 2017Assignment Chapte.docx
Hussam Malibari Heckman MAT 242 Spring 2017Assignment Chapte.docxwellesleyterresa
 
hw1.docxCS 211 Homework #1Please complete the homework problem.docx
hw1.docxCS 211 Homework #1Please complete the homework problem.docxhw1.docxCS 211 Homework #1Please complete the homework problem.docx
hw1.docxCS 211 Homework #1Please complete the homework problem.docxwellesleyterresa
 
HUS 335 Interpersonal Helping SkillsCase Assessment FormatT.docx
HUS 335 Interpersonal Helping SkillsCase Assessment FormatT.docxHUS 335 Interpersonal Helping SkillsCase Assessment FormatT.docx
HUS 335 Interpersonal Helping SkillsCase Assessment FormatT.docxwellesleyterresa
 
HW #1Tech Alert on IT & Strategy (Ch 3-5Ch 3 -5 IT Strategy opt.docx
HW #1Tech Alert on IT & Strategy (Ch 3-5Ch 3 -5 IT Strategy opt.docxHW #1Tech Alert on IT & Strategy (Ch 3-5Ch 3 -5 IT Strategy opt.docx
HW #1Tech Alert on IT & Strategy (Ch 3-5Ch 3 -5 IT Strategy opt.docxwellesleyterresa
 
HW 2 (1) Visit Monsanto (httpwww.monsanto.com) again and Goog.docx
HW 2 (1) Visit Monsanto (httpwww.monsanto.com) again and Goog.docxHW 2 (1) Visit Monsanto (httpwww.monsanto.com) again and Goog.docx
HW 2 (1) Visit Monsanto (httpwww.monsanto.com) again and Goog.docxwellesleyterresa
 
Hunters Son Dialogue Activity1. Please write 1-2 sentences for e.docx
Hunters Son Dialogue Activity1. Please write 1-2 sentences for e.docxHunters Son Dialogue Activity1. Please write 1-2 sentences for e.docx
Hunters Son Dialogue Activity1. Please write 1-2 sentences for e.docxwellesleyterresa
 
HW 2 - SQL The database you will use for this assignme.docx
HW 2 - SQL   The database you will use for this assignme.docxHW 2 - SQL   The database you will use for this assignme.docx
HW 2 - SQL The database you will use for this assignme.docxwellesleyterresa
 
Humanities Commons Learning Goals1. Write about primary and seco.docx
Humanities Commons Learning Goals1. Write about primary and seco.docxHumanities Commons Learning Goals1. Write about primary and seco.docx
Humanities Commons Learning Goals1. Write about primary and seco.docxwellesleyterresa
 
HURRICANE KATRINA A NATION STILL UNPREPARED .docx
HURRICANE KATRINA  A NATION STILL UNPREPARED   .docxHURRICANE KATRINA  A NATION STILL UNPREPARED   .docx
HURRICANE KATRINA A NATION STILL UNPREPARED .docxwellesleyterresa
 
Humanities 115Short Essay Grading CriteriaExcellentPassing.docx
Humanities 115Short Essay Grading CriteriaExcellentPassing.docxHumanities 115Short Essay Grading CriteriaExcellentPassing.docx
Humanities 115Short Essay Grading CriteriaExcellentPassing.docxwellesleyterresa
 

More from wellesleyterresa (20)

Hw059f6dbf-250a-4d74-8f5e-f28f14227edc.jpg__MACOSXHw._059.docx
Hw059f6dbf-250a-4d74-8f5e-f28f14227edc.jpg__MACOSXHw._059.docxHw059f6dbf-250a-4d74-8f5e-f28f14227edc.jpg__MACOSXHw._059.docx
Hw059f6dbf-250a-4d74-8f5e-f28f14227edc.jpg__MACOSXHw._059.docx
 
HW in teams of 3 studentsAn oil remanufacturing company uses c.docx
HW in teams of 3 studentsAn oil remanufacturing company uses c.docxHW in teams of 3 studentsAn oil remanufacturing company uses c.docx
HW in teams of 3 studentsAn oil remanufacturing company uses c.docx
 
HW 5.docxAssignment 5 – Currency riskYou may do this assig.docx
HW 5.docxAssignment 5 – Currency riskYou may do this assig.docxHW 5.docxAssignment 5 – Currency riskYou may do this assig.docx
HW 5.docxAssignment 5 – Currency riskYou may do this assig.docx
 
HW#3 – Spring 20181. Giulia is traveling from Italy to China. .docx
HW#3 – Spring 20181. Giulia is traveling from Italy to China. .docxHW#3 – Spring 20181. Giulia is traveling from Italy to China. .docx
HW#3 – Spring 20181. Giulia is traveling from Italy to China. .docx
 
HW 2Due July 1 by 500 PM.docx
HW 2Due July 1 by 500 PM.docxHW 2Due July 1 by 500 PM.docx
HW 2Due July 1 by 500 PM.docx
 
HW 4 Gung Ho Commentary DUE Thursday, April 20 at 505 PM on.docx
HW 4 Gung Ho Commentary DUE Thursday, April 20 at 505 PM on.docxHW 4 Gung Ho Commentary DUE Thursday, April 20 at 505 PM on.docx
HW 4 Gung Ho Commentary DUE Thursday, April 20 at 505 PM on.docx
 
HW 5 Math 405. Due beginning of class – Monday, 10 Oct 2016.docx
HW 5 Math 405. Due beginning of class – Monday, 10 Oct 2016.docxHW 5 Math 405. Due beginning of class – Monday, 10 Oct 2016.docx
HW 5 Math 405. Due beginning of class – Monday, 10 Oct 2016.docx
 
HW 5-RSAascii2str.mfunction str = ascii2str(ascii) .docx
HW 5-RSAascii2str.mfunction str = ascii2str(ascii)        .docxHW 5-RSAascii2str.mfunction str = ascii2str(ascii)        .docx
HW 5-RSAascii2str.mfunction str = ascii2str(ascii) .docx
 
HW 3 Project Control• Status meeting agenda – shows time, date .docx
HW 3 Project Control• Status meeting agenda – shows time, date .docxHW 3 Project Control• Status meeting agenda – shows time, date .docx
HW 3 Project Control• Status meeting agenda – shows time, date .docx
 
HW 1January 19 2017Due back Jan 26, in class.1. (T.docx
HW 1January 19 2017Due back Jan 26, in class.1. (T.docxHW 1January 19 2017Due back Jan 26, in class.1. (T.docx
HW 1January 19 2017Due back Jan 26, in class.1. (T.docx
 
Hussam Malibari Heckman MAT 242 Spring 2017Assignment Chapte.docx
Hussam Malibari Heckman MAT 242 Spring 2017Assignment Chapte.docxHussam Malibari Heckman MAT 242 Spring 2017Assignment Chapte.docx
Hussam Malibari Heckman MAT 242 Spring 2017Assignment Chapte.docx
 
hw1.docxCS 211 Homework #1Please complete the homework problem.docx
hw1.docxCS 211 Homework #1Please complete the homework problem.docxhw1.docxCS 211 Homework #1Please complete the homework problem.docx
hw1.docxCS 211 Homework #1Please complete the homework problem.docx
 
HUS 335 Interpersonal Helping SkillsCase Assessment FormatT.docx
HUS 335 Interpersonal Helping SkillsCase Assessment FormatT.docxHUS 335 Interpersonal Helping SkillsCase Assessment FormatT.docx
HUS 335 Interpersonal Helping SkillsCase Assessment FormatT.docx
 
HW #1Tech Alert on IT & Strategy (Ch 3-5Ch 3 -5 IT Strategy opt.docx
HW #1Tech Alert on IT & Strategy (Ch 3-5Ch 3 -5 IT Strategy opt.docxHW #1Tech Alert on IT & Strategy (Ch 3-5Ch 3 -5 IT Strategy opt.docx
HW #1Tech Alert on IT & Strategy (Ch 3-5Ch 3 -5 IT Strategy opt.docx
 
HW 2 (1) Visit Monsanto (httpwww.monsanto.com) again and Goog.docx
HW 2 (1) Visit Monsanto (httpwww.monsanto.com) again and Goog.docxHW 2 (1) Visit Monsanto (httpwww.monsanto.com) again and Goog.docx
HW 2 (1) Visit Monsanto (httpwww.monsanto.com) again and Goog.docx
 
Hunters Son Dialogue Activity1. Please write 1-2 sentences for e.docx
Hunters Son Dialogue Activity1. Please write 1-2 sentences for e.docxHunters Son Dialogue Activity1. Please write 1-2 sentences for e.docx
Hunters Son Dialogue Activity1. Please write 1-2 sentences for e.docx
 
HW 2 - SQL The database you will use for this assignme.docx
HW 2 - SQL   The database you will use for this assignme.docxHW 2 - SQL   The database you will use for this assignme.docx
HW 2 - SQL The database you will use for this assignme.docx
 
Humanities Commons Learning Goals1. Write about primary and seco.docx
Humanities Commons Learning Goals1. Write about primary and seco.docxHumanities Commons Learning Goals1. Write about primary and seco.docx
Humanities Commons Learning Goals1. Write about primary and seco.docx
 
HURRICANE KATRINA A NATION STILL UNPREPARED .docx
HURRICANE KATRINA  A NATION STILL UNPREPARED   .docxHURRICANE KATRINA  A NATION STILL UNPREPARED   .docx
HURRICANE KATRINA A NATION STILL UNPREPARED .docx
 
Humanities 115Short Essay Grading CriteriaExcellentPassing.docx
Humanities 115Short Essay Grading CriteriaExcellentPassing.docxHumanities 115Short Essay Grading CriteriaExcellentPassing.docx
Humanities 115Short Essay Grading CriteriaExcellentPassing.docx
 

  • 3. Both choices and values rely on incomplete, but adequate, information. With inadequate information, choice may exist, but it is meaningless, because you don’t know enough about what you are choosing to make a rational differentiation among options, and because the
  • 4. results of the choice are unknowable anyway. Without information, values become exercises in meaningless ritual. The Sixth Commandment is, after all, thou shalt not kill, but people do so all the time. Sometimes it is in a military operation, a form of killing that, done within the constraints of international law and norms, is generally considered lawful and ethical; sometimes it is the state responding to a criminal act with capital punishment; sometimes it is considered not justified and thus murder. It is the context—the information surrounding the situation in which the killing occurs—that determines whether or not the absolute rule applies. And utilitarianism obviously requires some information about the predictable outcomes of the decision, because otherwise how do you know that on balance you’ve improved the situation? We are used to a rate of change that, even if it does make choices and values somewhat contingent, at least allows for reasonable stability. Today, however, rapid and accelerating technological change, especially in the foundational
  • 5. technological systems known as the Five Horsemen—nanotechnology, biotechnology, information and communication technology, robotics, and applied cognitive science—has overthrown that stability. The cycle time of technology innovations and the rapidity with which they ripple through society have become far faster than the institutions and mechanisms that we’ve traditionally relied on to inform and enforce our choices and values. The results so far aren’t pretty. One popular approach, for example, is the precautionary principle. As stated in the U.N. World Charter for Nature, this requires that (Section II, para. 11(b)) “proponents [of a new technology] shall demonstrate that expected benefits outweigh potential damage to nature, and where potential adverse effects are not fully understood, the activities should not proceed.” Since no one can predict the future evolution and systemic effects of even a relatively trivial technology, it is impossible in any real-world situation to understand either the adverse or positive effects of introducing such a technology. Accordingly, this amounts to the (unrealistic) demand that technology simply stop. Some activists, of course, avoid the intermediate step and simply demand that technologies stop: This is the approach many European greens have taken to agricultural genetically modified organisms and some Americans have taken toward stem cell research. The environmental group ETC has called for “a moratorium on the environmental release or commercial use of nanomaterials, a ban on self-assembling nanomaterials and on
  • 6. patents on nanoscale technologies.” While on the surface these may appear to be supremely ethical positions, they are arguably simply evasions of the responsibility for ethical choice and values. It is romantic and simple to generate dystopian futures and then demand the world stop lest they occur. It is far more difficult to make the institutional and personal adjustments that choice and values require in the increasingly complex, rapidly changing world within which we find ourselves. It is romantic and simple to demand bans and similar extreme positions given that, in a world with many countries and cultural groups striving for dominance, there is little if any real chance of banning emerging technologies, especially if they provide a significant economic, technological, military, or security benefit. What is needed, then, is some way to more effectively assert meaningful choice and responsible values regarding emerging technologies. There are at least two ways that are already used by institutions that face both “known unknowns” and “unknown unknowns”: scenarios, and real-time dialog with rapidly changing systems. Scenarios, which are not predictions of what might happen, are like war games: They provide valuable experience in
  • 7. adaptation, as well as opportunities to think about what might happen. An interesting additional form of scenario is to engage science fiction in this regard. Good science fiction often takes complex emerging technologies and explores their implications in ways that might be fictional, but can lead to much deeper understanding of real-world systems. Real-time dialog means that institutions that must deal with unpredictable, complex adaptive systems know that there is no substitute for monitoring and responding to the changes that are thrown up by complex systems as they occur: Responsible and effective management requires a process of constant interaction. Such a process is, of course, only half the battle; the other is to design institutions that are robust enough, but agile enough, to use real-time information to make necessary changes rapidly. What is ethically responsible is not just fixation on rules or outcomes. Rather, it is to focus on the process and the institutions involved by making sure that there is a transparent and workable mechanism for observing and understanding the technology system as it evolves, and that relevant institutions are able to respond to what is learned rapidly and effectively. It is premature to say that we understand how to implement meaningful choice and responsible values when it comes to emerging technologies. Indeed, much of what we do today is naive and superficial, steeped in reflexive ideologies and overly rigid worldviews. But the good news is that we do know how to do better, and some of
  • 8. the steps we should take. It is, of course, a choice based on the values we hold as to whether we do so. On Friday, March 6, Arizona State University will host “Emerge: The Future of Choices and Values,” a festival of the future, in Tempe. For more information, visit emerge.asu.edu. Future Tense is a partnership of Slate, New America, and ASU. nytimes.com Technology Changing How Students Learn, Teachers Say By Matt Richtel | Oct. 31st, 2012 Lisa Baldwin, a chemistry teacher, works with her students to fight through academic challenges. Nancy Palmieri for The New York Times There is a widespread belief among teachers that students’ constant use of digital technology is hampering their attention spans and ability to
  • 9. persevere in the face of challenging tasks, according to two surveys of teachers being released on Thursday. The researchers note that their findings represent the subjective views of teachers and should not be seen as definitive proof that widespread use of computers, phones and video games affects students’ capability to focus. Even so, the researchers who performed the studies, as well as scholars who study technology’s impact on behavior and the brain, say the studies are significant because of the vantage points of teachers, who spend hours a day observing students. The timing of the studies, from two well-regarded research organizations, appears to be coincidental. One was conducted by the Pew Internet Project, a division of the Pew Research Center that focuses on technology-related research. The other
  • 10. comes from Common Sense Media, a nonprofit organization in San Francisco that advises parents on media use by children. It was conducted by Vicky Rideout, a researcher who has previously shown that media use among children and teenagers ages 8 to 18 has grown so fast that they on average spend twice as much time with screens each year as they spend in school. Teachers who were not involved in the surveys echoed their findings in interviews, saying they felt they had to work harder to capture and hold students’ attention. “I’m an entertainer. I have to do a song and dance to capture their attention,” said Hope Molina-Porter, 37, an English teacher at Troy High School in Fullerton, Calif., who has taught for 14 years. She teaches accelerated students, but has noted a marked decline in the depth and analysis of their written
  • 11. work. She said she did not want to shrink from the challenge of engaging them, nor did other teachers interviewed, but she also worried that technology was causing a deeper shift in how students learned. She also wondered if teachers were adding to the problem by adjusting their lessons to accommodate shorter attention spans. “Are we contributing to this?” Ms. Molina-Porter said. “What’s going to happen when they don’t have constant entertainment?” Scholars who study the role of media in society say no long- term studies have been done that adequately show how and if student attention span has changed because of the use of digital technology. But there is mounting indirect evidence that constant use of technology can affect behavior, particularly in developing brains, because of heavy stimulation and rapid shifts in attention.
  • 12. Kristen Purcell, the associate director for research at Pew, acknowledged that the findings could be viewed from another perspective: that the education system must adjust to better accommodate the way students learn, a point that some teachers brought up in focus groups themselves. “What we’re labeling as ‘distraction,’ some see as a failure of adults to see how these kids process information,” Ms. Purcell said. “They’re not saying distraction is good but that the label of ‘distraction’ is a judgment of this generation.” The surveys also found that many teachers said technology could be a useful educational tool. In the Pew survey, which was done in conjunction with the College Board and the National Writing Project, roughly 75 percent of 2,462 teachers surveyed said that the Internet and search engines had a “mostly positive” impact on student research skills. And they said such tools had made
  • 13. students more self-sufficient researchers. But nearly 90 percent said that digital technologies were creating “an easily distracted generation with short attention spans.” Hope Molina-Porter, an English teacher in Fullerton, Calif., worries that technology is deeply altering how students learn. Monica Almeida/The New York Times Similarly, of the 685 teachers surveyed in the Common Sense project, 71 percent said they thought technology was hurting attention span “somewhat” or “a lot.” About 60 percent said it hindered students’ ability to write and communicate face to face, and almost half said it hurt critical thinking and their ability to do homework. There was little difference in how younger and older teachers perceived the impact of technology.
  • 14. “Boy, is this a clarion call for a healthy and balanced media diet,” said Jim Steyer, the chief executive of Common Sense Media. He added, “What you have to understand as a parent is that what happens in the home with media consumption can affect academic achievement.” In interviews, teachers described what might be called a “Wikipedia problem,” in which students have grown so accustomed to getting quick answers with a few keystrokes that they are more likely to give up when an easy answer eludes them. The Pew research found that 76 percent of teachers believed students had been conditioned by the Internet to find quick answers. “They need skills that are different than ‘Spit, spit, there’s the answer,’ ” said Lisa Baldwin, 48, a high school teacher in Great Barrington, Mass., who said students’ ability to focus and fight through academic challenges was suffering an “exponential decline.” She said she saw the decline most sharply in students
  • 15. whose parents allowed unfettered access to television, phones, iPads and video games. For her part, Ms. Baldwin said she refused to lower her expectations or shift her teaching style to be more entertaining. But she does spend much more time in individual tutoring sessions, she added, coaching students on how to work through challenging assignments. Other teachers said technology was as much a solution as a problem. Dave Mendell, 44, a fourth-grade teacher in Wallingford, Pa., said that educational video games and digital presentations were excellent ways to engage students on their terms. Teachers also said they were using more dynamic and flexible teaching styles. “I’m tap dancing all over the place,” Mr. Mendell said. “The more I stand in front
  • 16. of class, the easier it is to lose them.” He added that it was tougher to engage students, but that once they were engaged, they were just as able to solve problems and be creative as they had been in the past. He would prefer, he added, for students to use less entertainment media at home, but he did not believe it represented an insurmountable challenge for teaching them at school. While the Pew research explored how technology has affected attention span, it also looked at how the Internet has changed student research habits. By contrast, the Common Sense survey focused largely on how teachers saw the impact of entertainment media on a range of classroom skills. The surveys include some findings that appear contradictory. In the Common Sense report, for instance, some teachers said that even as they saw attention spans wane, students were improving in subjects like math,
  • 17. science and reading. But researchers said the conflicting views could be the result of subjectivity and bias. For example, teachers may perceive themselves facing both a more difficult challenge but also believe that they are overcoming the challenge through effective teaching. Pew said its research gave a “complex and at times contradictory” picture of teachers’ view of technology’s impact. Dr. Dimitri Christakis, who studies the impact of technology on the brain and is the director of the Center for Child Health, Behavior and Development at Seattle Children’s Hospital, emphasized that teachers’ views were subjective but nevertheless could be accurate in sensing dwindling attention spans among students. His own research shows what happens to attention and focus in mice when they undergo the equivalent of heavy digital stimulation. Students
  • 18. saturated by entertainment media, he said, were experiencing a “supernatural” stimulation that teachers might have to keep up with or simulate. The heavy technology use, Dr. Christakis said, “makes reality by comparison uninteresting.” A version of this article appears in print on November 1, 2012, on Page A18 of the New York edition with the headline: For Better and for Worse, Technology Use Alters Learning Styles, Teachers Say. http://www.nytimes.com/2012/11/01/education/technology-is-changing-how-students-learn-teachers-say.html A World Without Work Adam Levey
  • 19. 1. Youngstown, U.S.A. The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one its residents can cite with precision: September 19, 1977. For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression. Youngstown was transformed not only by an economic disruption but also by a psychological and cultural breakdown. Depression, spousal abuse, and suicide all became much more prevalent; the caseload of the area’s mental-health center tripled within a decade. The city built four prisons in the mid-1990s—a rare growth industry. One of the few downtown construction projects of that period was a museum dedicated to the defunct steel industry.
  • 20. This winter, I traveled to Ohio to consider what would happen if technology permanently replaced a great deal of human work. I wasn’t seeking a tour of our automated future. I went because Youngstown has become a national metaphor for the decline of labor, a place where the middle class of the 20th century has become a museum exhibit. Derek Thompson talks with editor in chief James Bennet about the state of jobs in America. “Youngstown’s story is America’s story, because it shows that when jobs go away, the cultural cohesion of a place is destroyed,” says John Russo, a professor of labor studies at Youngstown State University. “The cultural breakdown matters even more than the economic breakdown.” In the past few years, even as the United States has pulled itself partway out of the jobs hole created by the Great Recession, some economists and technologists have warned that the economy is near a tipping point. When they peer deeply into labor-market data, they see troubling signs, masked for now by a cyclical recovery. And when they look up from their spreadsheets, they see automation high and low—robots in the operating room and behind
  • 21. the fast-food counter. They imagine self-driving cars snaking through the streets and Amazon drones dotting the sky, replacing millions of drivers, warehouse stockers, and retail workers. They observe that the capabilities of machines—already formidable—continue to expand exponentially, while our own remain the same. And they wonder: Is any job truly safe? Futurists and science-fiction writers have at times looked forward to machines’ workplace takeover with a kind of giddy excitement, imagining the banishment of drudgery and its replacement by expansive leisure and almost limitless personal freedom. And make no mistake: if the capabilities of computers continue to multiply while the price of computing continues to decline, that will mean a great many of life’s necessities and luxuries will become ever cheaper, and it will mean great wealth—at least when aggregated up to the level of the national economy. But even leaving aside questions of how to distribute that wealth, the widespread disappearance of work would usher in a social transformation unlike any we’ve seen. If John Russo is right, then saving work is more important than saving any particular job. Industriousness has served as America’s unofficial religion since its founding. The sanctity and preeminence of work lie at the heart of the country’s politics, economics, and social interactions. What might happen if work goes away? The U.S. labor force has been shaped by millennia of
  • 22. technological progress. Agricultural technology birthed the farming industry, the industrial revolution moved people into factories, and then globalization and automation moved them back out, giving rise to a nation of services. But throughout these reshufflings, the total number of jobs has always increased. What may be looming is something different: an era of technological unemployment, in which computer scientists and software engineers essentially invent us out of work, and the total number of jobs declines steadily and permanently. This fear is not new. The hope that machines might free us from toil has always been intertwined with the fear that they will rob us of our agency. In the midst of the Great Depression, the economist John Maynard Keynes forecast that technological progress might allow a 15-hour workweek, and abundant leisure, by 2030. But around the same time, President Herbert Hoover received a letter warning that industrial technology was a “Frankenstein monster” that threatened to upend manufacturing, “devouring our civilization.” (The letter came from the mayor of Palo Alto, of all places.) In 1962, President John F. Kennedy said, “If men have the talent to invent new machines that put men out of work, they have the talent to put those men back to work.” But two years later, a committee of scientists and social activists sent an open letter to President Lyndon B. Johnson arguing that “the cybernation revolution” would create “a separate nation of the poor, the unskilled, the jobless,” who would be unable either to find work or to afford
  • 23. life’s necessities. Adam Levey The job market defied doomsayers in those earlier times, and according to the most frequently reported jobs numbers, it has so far done the same in our own time. Unemployment is currently just over 5 percent, and 2014 was this century’s best year for job growth. One could be forgiven for saying that recent predictions about technological job displacement are merely forming the latest chapter in a long story called The Boys Who Cried Robot—one in which the robot, unlike the wolf, never arrives in the end. The end-of-work argument has often been dismissed as the “Luddite fallacy,” an allusion to the 19th-century British brutes who smashed textile-making machines at the dawn of the industrial revolution, fearing the machines would put hand- weavers out of work. But some of the most sober economists are beginning to worry that the Luddites weren’t wrong, just premature. When former Treasury Secretary Lawrence Summers was an MIT undergraduate in the early 1970s, many economists disdained “the stupid people [who] thought that automation was going to make all the jobs go away,” he said at the National Bureau of Economic Research Summer Institute in July 2013. “Until a few years ago, I didn’t think this
  • 24. was a very complicated subject: the Luddites were wrong, and the believers in technology and technological progress were right. I’m not so completely certain now.” 2. Reasons to Cry Robot What does the “end of work” mean, exactly? It does not mean the imminence of total unemployment, nor is the United States remotely likely to face, say, 30 or 50 percent unemployment within the next decade. Rather, technology could exert a slow but continual downward pressure on the value and availability of work—that is, on wages and on the share of prime-age workers with full-time jobs. Eventually, by degrees, that could create a new normal, where the expectation that work will be a central feature of adult life dissipates for a significant portion of society. After 300 years of people crying wolf, there are now three broad reasons to take seriously the argument that the beast is at the door: the ongoing triumph of capital over labor, the quiet demise of the working man, and the impressive dexterity of information technology. • Labor’s losses. One of the first things we might expect to see in a period of technological displacement is the diminishment of human labor as a driver of economic growth. In fact, signs that this is happening have been present for quite some time. The share of U.S. economic output that’s paid out in wages fell steadily in the 1980s, reversed some of its
  • 25. losses in the ’90s, and then continued falling after 2000, accelerating during the Great Recession. It now stands at its lowest level since the government started keeping track in the mid‑20th century. A number of theories have been advanced to explain this phenomenon, including globalization and its accompanying loss of bargaining power for some workers. But Loukas Karabarbounis and Brent Neiman, economists at the University of Chicago, have estimated that almost half of the decline is the result of businesses’ replacing workers with computers and software. In 1964, the nation’s most valuable company, AT&T, was worth $267 billion in today’s dollars and employed 758,611 people. Today’s telecommunications giant, Google, is worth $370 billion but has only about 55,000 employees—less than a tenth the size of AT&T’s workforce in its heyday. The paradox of work is that many people hate their jobs, but they are considerably more miserable doing nothing. • The spread of nonworking men and underemployed youth. The share of prime-age Americans (25 to 54 years old) who are working has been trending down since 2000. Among men, the decline began even earlier: the share of prime-age men who are neither working nor looking for work has doubled since the late 1970s, and has increased as much throughout
  • 26. the recovery as it did during the Great Recession itself. All in all, about one in six prime-age men today are either unemployed or out of the workforce altogether. This is what the economist Tyler Cowen calls “the key statistic” for understanding the spreading rot in the American workforce. Conventional wisdom has long held that under normal economic conditions, men in this age group—at the peak of their abilities and less likely than women to be primary caregivers for children—should almost all be working. Yet fewer and fewer are. Economists cannot say for certain why men are turning away from work, but one explanation is that technological change has helped eliminate the jobs for which many are best suited. Since 2000, the number of manufacturing jobs has fallen by almost 5 million, or about 30 percent. Young people just coming onto the job market are also struggling—and by many measures have been for years. Six years into the recovery, the share of recent college grads who are “underemployed” (in jobs that historically haven’t required a degree) is still higher than it was in 2007—or, for that matter, 2000. And the supply of these “non-college jobs” is shifting away from high-paying occupations, such as electrician, toward low- wage service jobs, such as waiter. More people are pursuing higher education, but the real wages of recent college graduates have fallen by 7.7 percent since 2000. In the biggest picture, the job market appears to be requiring more and more preparation for a lower
  • 27. and lower starting wage. The distorting effect of the Great Recession should make us cautious about overinterpreting these trends, but most began before the recession, and they do not seem to speak encouragingly about the future of work. • The shrewdness of software. One common objection to the idea that technology will permanently displace huge numbers of workers is that new gadgets, like self-checkout kiosks at drugstores, have failed to fully displace their human counterparts, like cashiers. But employers typically take years to embrace new machines at the expense of workers. The robotics revolution began in factories in the 1960s and ’70s, but manufacturing employment kept rising until 1980, and then collapsed during the subsequent recessions. Likewise, “the personal computer existed in the ’80s,” says Henry Siu, an economist at the University of British Columbia, “but you don’t see any effect on office and administrative-support jobs until the 1990s, and then suddenly, in the last recession, it’s huge. So today you’ve got checkout screens and the promise of driverless cars, flying drones, and little warehouse robots. We know that these tasks can be done by machines rather than people. But we may not see the effect until the next recession, or the recession after that.” Ryan Carson, the CEO of Treehouse, discusses the benefits of a 32-hour workweek. Some observers say our humanity is a moat that machines cannot cross. They believe
  • 28. people’s capacity for compassion, deep understanding, and creativity are inimitable. But as Erik Brynjolfsson and Andrew McAfee have argued in their book The Second Machine Age, computers are so dexterous that predicting their application 10 years from now is almost impossible. Who could have guessed in 2005, two years before the iPhone was released, that smartphones would threaten hotel jobs within the decade, by helping homeowners rent out their apartments and houses to strangers on Airbnb? Or that the company behind the most popular search engine would design a self-driving car that could soon threaten driving, the most common job occupation among American men? In 2013, Oxford University researchers forecast that machines might be able to perform half of all U.S. jobs in the next two decades. The projection was audacious, but in at least a few cases, it probably didn’t go far enough. For example, the authors named psychologist as one of the occupations least likely to be “computerisable.” But some research suggests that people are more honest in therapy sessions when they believe they are confessing their troubles to a computer, because a machine can’t pass moral judgment. Google and WebMD already may be answering questions once reserved for one’s therapist. This doesn’t prove that psychologists are going the way of the textile worker. Rather, it shows how easily computers can encroach on areas previously considered “for
  • 29. humans only.” After 300 years of breathtaking innovation, people aren’t massively unemployed or indentured by machines. But to suggest how this could change, some economists have pointed to the defunct career of the second-most-important species in U.S. economic history: the horse. For many centuries, people created technologies that made the horse more productive and more valuable—like plows for agriculture and swords for battle. One might have assumed that the continuing advance of complementary technologies would make the animal ever more essential to farming and fighting, historically perhaps the two most consequential human activities. Instead came inventions that made the horse obsolete—the tractor, the car, and the tank. After tractors rolled onto American farms in the early 20th century, the population of horses and mules began to decline steeply, falling nearly 50 percent by the 1930s and 90 percent by the 1950s. Humans can do much more than trot, carry, and pull. But the skills required in most offices hardly elicit our full range of intelligence. Most jobs are still boring, repetitive, and easily learned. The most-common occupations in the United States are retail salesperson, cashier, food and beverage server, and office clerk. Together, these four jobs employ 15.4 million people—nearly 10 percent of the labor force, or more workers than there are in Texas and Massachusetts combined. Each is highly susceptible to
  • 30. automation, according to the Oxford study. Technology creates some jobs too, but the creative half of creative destruction is easily overstated. Nine out of 10 workers today are in occupations that existed 100 years ago, and just 5 percent of the jobs generated between 1993 and 2013 came from “high tech” sectors like computing, software, and telecommunications. Our newest industries tend to be the most labor-efficient: they just don’t require many people. It is for precisely this reason that the economic historian Robert Skidelsky, comparing the exponential growth in computing power with the less-than-exponential growth in job complexity, has said, “Sooner or later, we will run out of jobs.” Is that certain—or certainly imminent? No. The signs so far are murky and suggestive. The most fundamental and wrenching job restructurings and contractions tend to happen during recessions: we’ll know more after the next couple of downturns. But the possibility seems significant enough—and the consequences disruptive enough— that we owe it to ourselves to start thinking about what society could look like without universal work, in an effort to begin nudging it toward the better outcomes and away from the worse ones. To paraphrase the science-fiction novelist William Gibson, there are, perhaps, fragments of
  • 31. the post-work future distributed throughout the present. I see three overlapping possibilities as formal employment opportunities decline. Some people displaced from the formal workforce will devote their freedom to simple leisure; some will seek to build productive communities outside the workplace; and others will fight, passionately and in many cases fruitlessly, to reclaim their productivity by piecing together jobs in an informal economy. These are futures of consumption, communal creativity, and contingency. In any combination, it is almost certain that the country would have to embrace a radical new role for government. 3. Consumption: The Paradox of Leisure Work is really three things, says Peter Frase, the author of Four Futures, a forthcoming book about how automation will change America: the means by which the economy produces goods, the means by which people earn income, and an activity that lends meaning or purpose to many people’s lives. “We tend to conflate these things,” he told me, “because today we need to pay people to keep the lights on, so to speak. But in a future of abundance, you wouldn’t, and we ought to think about ways to make it easier and better to not be employed.” Frase belongs to a small group of writers, academics, and economists—they have been called “post-workists”—who welcome, even root for, the end of labor. American society has “an irrational belief in work for work’s sake,” says Benjamin
  • 32. Hunnicutt, another post-workist and a historian at the University of Iowa, even though most jobs aren’t so uplifting. A 2014 Gallup report of worker satisfaction found that as many as 70 percent of Americans don’t feel engaged by their current job. Hunnicutt told me that if a cashier’s work were a video game— grab an item, find the bar code, scan it, slide the item onward, and repeat—critics of video games might call it mindless. But when it’s a job, politicians praise its intrinsic dignity. “Purpose, meaning, identity, fulfillment, creativity, autonomy— all these things that positive psychology has shown us to be necessary for well-being are absent in the average job,” he said. The post-workists are certainly right about some important things. Paid labor does not always map to social good. Raising children and caring for the sick is essential work, and these jobs are compensated poorly or not at all. In a post-work society, Hunnicutt said, people might spend more time caring for their families and neighbors; pride could come from our relationships rather than from our careers. The post-work proponents acknowledge that, even in the best post-work scenarios, pride and jealousy will persevere, because reputation will always be scarce, even in an economy of abundance. But with the right government provisions, they believe, the end of wage labor will allow for a golden age of well-being. Hunnicutt said he thinks colleges could reemerge as
  • 33. cultural centers rather than job-prep institutions. The word school, he pointed out, comes from skholē, the Greek word for “leisure.” “We used to teach people to be free,” he said. “Now we teach them to work.” Hunnicutt’s vision rests on certain assumptions about taxation and redistribution that might not be congenial to many Americans today. But even leaving that aside for the moment, this vision is problematic: it doesn’t resemble the world as it is currently experienced by most jobless people. By and large, the jobless don’t spend their downtime socializing with friends or taking up new hobbies. Instead, they watch TV or sleep. Time-use surveys show that jobless prime-age people dedicate some of the time once spent working to cleaning and childcare. But men in particular devote most of their free time to leisure, the lion’s share of which is spent watching television, browsing the Internet, and sleeping. Retired seniors watch about 50 hours of television a week, according to Nielsen. That means they spend a majority of their lives either sleeping or sitting on the sofa looking at a flatscreen. The unemployed theoretically have the most time to socialize, and yet studies have shown that they feel the most social isolation; it is surprisingly hard to replace the camaraderie of the water cooler. Most people want to work, and are miserable when they cannot. The ills of unemployment go well beyond the loss of income; people who lose their job are
  • 34. more likely to suffer from mental and physical ailments. “There is a loss of status, a general malaise and demoralization, which appears somatically or psychologically or both,” says Ralph Catalano, a public-health professor at UC Berkeley. Research has shown that it is harder to recover from a long bout of joblessness than from losing a loved one or suffering a life-altering injury. The very things that help many people recover from other emotional traumas—a routine, an absorbing distraction, a daily purpose—are not readily available to the unemployed. Adam Levey The transition from labor force to leisure force would likely be particularly hard on Americans, the worker bees of the rich world: Between 1950 and 2012, annual hours worked per worker fell significantly throughout Europe—by about 40 percent in Germany and the Netherlands— but by only 10 percent in the United States. Richer, college- educated Americans are working more than they did 30 years ago, particularly when you count time working and answering e- mail at home. In 1989, the psychologists Mihaly Csikszentmihalyi and Judith LeFevre conducted a famous study of Chicago workers that found people at work often wished they were somewhere else. But in questionnaires, these same workers reported feeling better and less anxious in the office or at the plant than they did elsewhere. The two psychologists called this “the paradox
  • 35. of work”: many people are happier complaining about jobs than they are luxuriating in too much leisure. Other researchers have used the term guilty couch potato to describe people who use media to relax but often feel worthless when they reflect on their unproductive downtime. Contentment speaks in the present tense, but something more—pride—comes only in reflection on past accomplishments. The post-workists argue that Americans work so hard because their culture has conditioned them to feel guilty when they are not being productive, and that this guilt will fade as work ceases to be the norm. This might prove true, but it’s an untestable hypothesis. When I asked Hunnicutt what sort of modern community most resembles his ideal of a post-work society, he admitted, “I’m not sure that such a place exists.” Less passive and more nourishing forms of mass leisure could develop. Arguably, they already are developing. The Internet, social media, and gaming offer entertainments that are as easy to slip into as is watching TV, but all are more purposeful and often less isolating. Video games, despite the derision aimed at them, are vehicles for achievement of a sort. Jeremy Bailenson, a communications professor at Stanford, says that as virtual-reality technology improves, people’s “cyber-existence” will become as rich and social as their “real” life. Games in which users climb “into another person’s
  • 36. skin to embody his or her experiences firsthand” don’t just let people live out vicarious fantasies, he has argued, but also “help you live as somebody else to teach you empathy and pro-social skills.” But it’s hard to imagine that leisure could ever entirely fill the vacuum of accomplishment left by the demise of labor. Most people do need to achieve things through, yes, work to feel a lasting sense of purpose. To envision a future that offers more than minute-to-minute satisfaction, we have to imagine how millions of people might find meaningful work without formal wages. So, inspired by the predictions of one of America’s most famous labor economists, I took a detour on my way to Youngstown and stopped in Columbus, Ohio. 4. Communal Creativity: The Artisans’ Revenge Artisans made up the original American middle class. Before industrialization swept through the U.S. economy, many people who didn’t work on farms were silversmiths, blacksmiths, or woodworkers. These artisans were ground up by the machinery of mass production in the 20th century. But Lawrence Katz, a labor economist at Harvard, sees the next wave of automation returning us to an age of craftsmanship and artistry. In particular, he looks forward to the ramifications of 3‑D printing, whereby machines construct complex objects from digital designs. The factories that arose more than a century ago “could make
  • 37. Model Ts and forks and knives and mugs and glasses in a standardized, cheap way, and that drove the artisans out of business,” Katz told me. “But what if the new tech, like 3-D- printing machines, can do customized things that are almost as cheap? It’s possible that information technology and robots eliminate traditional jobs and make possible a new artisanal economy … an economy geared around self-expression, where people would do artistic things with their time.” The most common jobs are salesperson, cashier, food and beverage server, and office clerk. Each is highly susceptible to automation. In other words, it would be a future not of consumption but of creativity, as technology returns the tools of the assembly line to individuals, democratizing the means of mass production. Something like this future is already present in the small but growing number of industrial shops called “makerspaces” that have popped up in the United States and around the world. The Columbus Idea Foundry is the country’s largest such space, a cavernous converted shoe factory stocked with industrial-age machinery. Several hundred members pay a monthly fee to use its arsenal of machines to make gifts and jewelry; weld, finish, and paint; play with plasma cutters and work an angle grinder; or operate a lathe
  • 38. with a machinist. When I arrived there on a bitterly cold afternoon in February, a chalkboard standing on an easel by the door displayed three arrows, pointing toward bathrooms, pewter casting, and zombies. Near the entrance, three men with black fingertips and grease-stained shirts took turns fixing a 60-year-old metal-turning lathe. Behind them, a resident artist was tutoring an older woman on how to transfer her photographs onto a large canvas, while a couple of guys fed pizza pies into a propane-fired stone oven. Elsewhere, men in protective goggles welded a sign for a local chicken restaurant, while others punched codes into a computer-controlled laser-cutting machine. Beneath the din of drilling and wood- cutting, a Pandora rock station hummed tinnily from a Wi‑Fi-connected Edison phonograph horn. The foundry is not just a gymnasium of tools. It is a social center. Adam Levey Alex Bandar, who started the foundry after receiving a doctorate in materials science and engineering, has a theory about the rhythms of invention in American history. Over the past century, he told me, the economy has moved from hardware to software, from atoms to bits, and people have spent more time at work in front of screens. But as computers take over more tasks previously considered the province of humans, the pendulum will swing back from bits to atoms, at least when it comes to how people spend their days. Bandar thinks
  • 39. that a digitally preoccupied society will come to appreciate the pure and distinct pleasure of making things you can touch. “I’ve always wanted to usher in a new era of technology where robots do our bidding,” Bandar said. “If you have better batteries, better robotics, more dexterous manipulation, then it’s not a far stretch to say robots do most of the work. So what do we do? Play? Draw? Actually talk to each other again?” You don’t need any particular fondness for plasma cutters to see the beauty of an economy where tens of millions of people make things they enjoy making—whether physical or digital, in buildings or in online communities—and receive feedback and appreciation for their work. The Internet and the cheap availability of artistic tools have already empowered millions of people to produce culture from their living rooms. People upload more than 400,000 hours of YouTube videos and 350 million new Facebook photos every day. The demise of the formal economy could free many would-be artists, writers, and craftspeople to dedicate their time to creative interests—to live as cultural producers. Such activities offer virtues that many organizational psychologists consider central to satisfaction at work: independence, the chance to develop mastery, and a sense of purpose. After touring the foundry, I sat at a long table with several members, sharing the pizza that had come out of the communal oven. I asked them what they thought of their organization as a model for a future where automation reached further into the formal economy. A mixed-
  • 40. media artist named Kate Morgan said that most people she knew at the foundry would quit their jobs and use the foundry to start their own business if they could. Others spoke about the fundamental need to witness the outcome of one’s work, which was satisfied more deeply by craftsmanship than by other jobs they’d held. Late in the conversation, we were joined by Terry Griner, an engineer who had built miniature steam engines in his garage before Bandar invited him to join the foundry. His fingers were covered in soot, and he told me about the pride he had in his ability to fix things. “I’ve been working since I was 16. I’ve done food service, restaurant work, hospital work, and computer programming. I’ve done a lot of different jobs,” said Griner, who is now a divorced father. “But if we had a society that said, ‘We’ll cover your essentials, you can work in the shop,’ I think that would be utopia. That, to me, would be the best of all possible worlds.” 5. Contingency: “You’re on Your Own” One mile to the east of downtown Youngstown, in a brick building surrounded by several empty lots, is Royal Oaks, an iconic blue-collar dive. At about 5:30 p.m. on a Wednesday, the place was nearly full. The bar glowed yellow and green from the lights mounted along a wall. Old beer signs, trophies, masks, and mannequins cluttered the back corner of the main room, like party leftovers stuffed in an attic. The scene was mostly
  • 41. middle-aged men, some in groups, talking loudly about baseball and smelling vaguely of pot; some drank alone at the bar, sitting quietly or listening to music on headphones. I spoke with several patrons there who work as musicians, artists, or handymen; many did not hold a steady job. “It is the end of a particular kind of wage work,” said Hannah Woodroofe, a bartender there who, it turns out, is also a graduate student at the University of Chicago. (She’s writing a dissertation on Youngstown as a harbinger of the future of work.) A lot of people in the city make ends meet via “post-wage arrangements,” she said, working for tenancy or under the table, or trading services. Places like Royal Oaks are the new union halls: People go there not only to relax but also to find tradespeople for particular jobs, like auto repair. Others go to exchange fresh vegetables, grown in urban gardens they’ve created amid Youngstown’s vacant lots. When an entire area, like Youngstown, suffers from high and prolonged unemployment, problems caused by unemployment move beyond the personal sphere; widespread joblessness shatters neighborhoods and leaches away their civic spirit. John Russo, the Youngstown State professor, who is a co-author of a history of the city, Steeltown USA, says the local identity took a savage blow when residents lost the ability to find reliable employment. “I can’t stress this enough: this isn’t just about economics; it’s psychological,”
  • 42. he told me. Russo sees Youngstown as the leading edge of a larger trend toward the development of what he calls the “precariat”—a working class that swings from task to task in order to make ends meet and suffers a loss of labor rights, bargaining rights, and job security. In Youngstown, many of these workers have by now made their peace with insecurity and poverty by building an identity, and some measure of pride, around contingency. The faith they lost in institutions—the corporations that have abandoned the city, the police who have failed to keep them safe—has not returned. But Russo and Woodroofe both told me they put stock in their own independence. And so a place that once defined itself single-mindedly by the steel its residents made has gradually learned to embrace the valorization of well-rounded resourcefulness. Karen Schubert, a 54-year-old writer with two master’s degrees, accepted a part-time job as a hostess at a café in Youngstown early this year, after spending months searching for full-time work. Schubert, who has two grown children and an infant grandson, said she’d loved teaching writing and literature at the local university. But many colleges have replaced full-time professors with part-time adjuncts in order to control costs, and she’d found that with the hours she could get, adjunct teaching didn’t pay a living
  • 43. wage, so she’d stopped. “I think I would feel like a personal failure if I didn’t know that so many Americans have their leg caught in the same trap,” she said. Among Youngstown’s precariat, one can see a third possible future, where millions of people struggle for years to build a sense of purpose in the absence of formal jobs, and where entrepreneurship emerges out of necessity. But while it lacks the comforts of the consumption economy or the cultural richness of Lawrence Katz’s artisanal future, it is more complex than an outright dystopia. “There are young people working part-time in the new economy who feel independent, whose work and personal relationships are contingent, and say they like it like this—to have short hours so they have time to focus on their passions,” Russo said. Schubert’s wages at the café are not enough to live on, and in her spare time, she sells books of her poetry at readings and organizes gatherings of the literary-arts community in Youngstown, where other writers (many of them also underemployed) share their prose. The evaporation of work has deepened the local arts and music scene, several residents told me, because people who are inclined toward the arts have so much time to spend with one another. “We’re a devastatingly poor and hemorrhaging
  • 44. population, but the people who live here are fearless and creative and phenomenal,” Schubert said. Whether or not one has artistic ambitions as Schubert does, it is arguably growing easier to find short-term gigs or spot employment. Paradoxically, technology is the reason. A constellation of Internet-enabled companies matches available workers with quick jobs, most prominently including Uber (for drivers), Seamless (for meal deliverers), Homejoy (for house cleaners), and TaskRabbit (for just about anyone else). And online markets like Craigslist and eBay have likewise made it easier for people to take on small independent projects, such as furniture refurbishing. Although the on-demand economy is not yet a major part of the employment picture, the number of “temporary-help services” workers has grown by 50 percent since 2010, according to the Bureau of Labor Statistics. Some of these services, too, could be usurped, eventually, by machines. But on-demand apps also spread the work around by carving up jobs, like driving a taxi, into hundreds of little tasks, like a single drive, which allows more people to compete for smaller pieces of work. These new arrangements are already challenging the legal definitions of employer and employee, and there are many reasons to be ambivalent about them. But if the future involves a declining number of full-time jobs, as in Youngstown, then splitting some of the remaining work up among many part-time workers, instead of a few full-timers, wouldn’t necessarily be a bad development. We shouldn’t be too quick to excoriate
  • 45. companies that let people combine their work, art, and leisure in whatever ways they choose. Today the norm is to think about employment and unemployment as a black-and-white binary, rather than two points at opposite ends of a wide spectrum of working arrangements. As late as the mid-19th century, though, the modern concept of “unemployment” didn’t exist in the United States. Most people lived on farms, and while paid work came and went, home industry—canning, sewing, carpentry—was a constant. Even in the worst economic panics, people typically found productive things to do. The despondency and helplessness of unemployment were discovered, to the bafflement and dismay of cultural critics, only after factory work became dominant and cities swelled. The 21st century, if it presents fewer full-time jobs in the sectors that can be automated, could in this respect come to resemble the mid-19th century: an economy marked by episodic work across a range of activities, the loss of any one of which would not make somebody suddenly idle. Many bristle that contingent gigs offer a devil’s bargain—a bit of additional autonomy in exchange for a larger loss of security. But some might thrive in a market where versatility and hustle are rewarded—where there are, as in Youngstown, few jobs to have, yet many things to do.
  • 46. 6. Government: The Visible Hand In the 1950s, Henry Ford II, the CEO of Ford, and Walter Reuther, the head of the United Auto Workers union, were touring a new engine plant in Cleveland. Ford gestured to a fleet of machines and said, “Walter, how are you going to get these robots to pay union dues?” The union boss famously replied: “Henry, how are you going to get them to buy your cars?” As Martin Ford (no relation) writes in his new book, The Rise of the Robots, this story might be apocryphal, but its message is instructive. We’re pretty good at noticing the immediate effects of technology’s substituting for workers, such as fewer people on the factory floor. What’s harder is anticipating the second-order effects of this transformation, such as what happens to the consumer economy when you take away the consumers. Technological progress on the scale we’re imagining would usher in social and cultural changes that are almost impossible to fully envision. Consider just how fundamentally work has shaped America’s geography. Today’s coastal cities are a jumble of office buildings and residential space. Both are expensive and tightly constrained. But the decline of work would make many office buildings unnecessary. What might that mean for the vibrancy of urban areas? Would office space yield seamlessly to apartments, allowing more people to live more affordably in city centers and leaving the cities themselves just as lively? Or would we see
  • 47. vacant shells and spreading blight? Would big cities make sense at all if their role as highly sophisticated labor ecosystems were diminished? As the 40-hour workweek faded, the idea of a lengthy twice-daily commute would almost certainly strike future generations as an antiquated and baffling waste of time. But would those generations prefer to live on streets full of high-rises, or in smaller towns? Today, many working parents worry that they spend too many hours at the office. As full-time work declined, rearing children could become less overwhelming. And because job opportunities historically have spurred migration in the United States, we might see less of it; the diaspora of extended families could give way to more closely knitted clans. But if men and women lost their purpose and dignity as work went away, those families would nonetheless be troubled. The decline of the labor force would make our politics more contentious. Deciding how to tax profits and distribute income could become the most significant economic-policy debate in American history. In The Wealth of Nations, Adam Smith used the term invisible hand to refer to the order and social benefits that arise, surprisingly, from individuals’ selfish actions. But to preserve the consumer economy and the social fabric, governments might have to embrace what Haruhiko Kuroda, the governor of the Bank of Japan, has
  • 48. called the visible hand of economic intervention. What follows is an early sketch of how it all might work. In the near term, local governments might do well to create more and more-ambitious community centers or other public spaces where residents can meet, learn skills, bond around sports or crafts, and socialize. Two of the most common side effects of unemployment are loneliness, on the individual level, and the hollowing-out of community pride. A national policy that directed money toward centers in distressed areas might remedy the maladies of idleness, and form the beginnings of a long-term experiment on how to reengage people in their neighborhoods in the absence of full employment. We could also make it easier for people to start their own, small-scale (and even part-time) businesses. New-business formation has declined in the past few decades in all 50 states. One way to nurture fledgling ideas would be to build out a network of business incubators. Here Youngstown offers an unexpected model: its business incubator has been recognized internationally, and its success has brought new hope to West Federal Street, the city’s main drag. Near the beginning of any broad decline in job availability, the United States might take a lesson from Germany on job-sharing. The German government gives firms incentives to cut all their workers’ hours rather than lay off some of them during
  • 49. hard times. So a company with 50 workers that might otherwise lay off 10 people instead reduces everyone’s hours by 20 percent. Such a policy would help workers at established firms keep their attachment to the labor force despite the declining amount of overall labor. Spreading work in this way has its limits. Some jobs can’t be easily shared, and in any case, sharing jobs wouldn’t stop labor’s pie from shrinking: it would only apportion the slices differently. Eventually, Washington would have to somehow spread wealth, too. One way of doing that would be to more heavily tax the growing share of income going to the owners of capital, and use the money to cut checks to all adults. This idea—called a “universal basic income”—has received bipartisan support in the past. Many liberals currently support it, and in the 1960s, Richard Nixon and the conservative economist Milton Friedman each proposed a version of the idea. That history notwithstanding, the politics of universal income in a world without universal work would be daunting. The rich could say, with some accuracy, that their hard work was subsidizing the idleness of millions of “takers.” What’s more, although a universal income might replace lost wages, it would do little to preserve the social benefits of work.
  • 50. The most direct solution to the latter problem would be for the government to pay people to do something, rather than nothing. Although this smacks of old European socialism, or Depression-era “makework,” it might do the most to preserve virtues such as responsibility, agency, and industriousness. In the 1930s, the Works Progress Administration did more than rebuild the nation’s infrastructure. It hired 40,000 artists and other cultural workers to produce music and theater, murals and paintings, state and regional travel guides, and surveys of state records. It’s not impossible to imagine something like the WPA—or an effort even more capacious—for a post-work future. What might that look like? Several national projects might justify direct hiring, such as caring for a rising population of elderly people. But if the balance of work continues to shift toward the small-bore and episodic, the simplest way to help everybody stay busy might be government sponsorship of a national online marketplace of work (or, alternatively, a series of local ones, sponsored by local governments). Individuals could browse for large long-term projects, like cleaning up after a natural disaster, or small short-term ones: an hour of tutoring, an evening of entertainment, an art commission. The requests could come from local governments or community associations or nonprofit groups; from rich families seeking nannies or tutors; or from other individuals given some number of credits to “spend” on the
  • 51. site each year. To ensure a baseline level of attachment to the workforce, the government could pay adults a flat rate in return for some minimum level of activity on the site, but people could always earn more by taking on more gigs. Although a digital WPA might strike some people as a strange anachronism, it would be similar to a federalized version of Mechanical Turk, the popular Amazon sister site where individuals and companies post projects of varying complexity, while so-called Turks on the other end browse tasks and collect money for the ones they complete. Mechanical Turk was designed to list tasks that cannot be performed by a computer. (The name is an allusion to an 18th-century Austrian hoax, in which a famous automaton that seemed to play masterful chess concealed a human player who chose the moves and moved the pieces.) A government marketplace might likewise specialize in those tasks that required empathy, humanity, or a personal touch. By connecting millions of people in one central hub, it might even inspire what the technology writer Robin Sloan has called “a Cambrian explosion of mega-scale creative and intellectual pursuits, a generation of Wikipedia-scale projects that can ask their users for even deeper commitments.” There’s a case to be made for using the tools of government to provide other incentives as well, to help people avoid the typical traps of joblessness and
  • 52. build rich lives and vibrant communities. After all, the members of the Columbus Idea Foundry probably weren’t born with an innate love of lathe operation or laser-cutting. Mastering these skills requires discipline; discipline requires an education; and an education, for many people, involves the expectation that hours of often frustrating practice will eventually prove rewarding. In a post-work society, the financial rewards of education and training won’t be as obvious. This is a singular challenge of imagining a flourishing post-work society: How will people discover their talents, or the rewards that come from expertise, if they don’t see much incentive to develop either? Modest payments to young people for attending and completing college, skills-training programs, or community-center workshops might eventually be worth considering. This seems radical, but the aim would be conservative—to preserve the status quo of an educated and engaged society. Whatever their career opportunities, young people will still grow up to be citizens, neighbors, and even, episodically, workers. Nudges toward education and training might be particularly beneficial to men, who are more likely to withdraw into their living rooms when they become unemployed. 7. Jobs and Callings
  • 53. Decades from now, perhaps the 20th century will strike future historians as an aberration, with its religious devotion to overwork in a time of prosperity, its attenuations of family in service to job opportunity, its conflation of income with self-worth. The post-work society I’ve described holds a warped mirror up to today’s economy, but in many ways it reflects the forgotten norms of the mid-19th century—the artisan middle class, the primacy of local communities, and the unfamiliarity with widespread joblessness. The three potential futures of consumption, communal creativity, and contingency are not separate paths branching out from the present. They’re likely to intertwine and even influence one another. Entertainment will surely become more immersive and exert a gravitational pull on people without much to do. But if that’s all that happens, society will have failed. The foundry in Columbus shows how the “third places” in people’s lives (communities separate from their homes and offices) could become central to growing up, learning new skills, discovering passions. And with or without such places, many people will need to embrace the resourcefulness learned over time by cities like Youngstown, which, even if they seem like museum exhibits of an old economy, might foretell the future for many more cities in the next 25 years. On my last day in Youngstown, I met with Howard Jesko, a 60-year-old Youngstown State graduate student, at a burger joint along the main street. A few months after Black Friday in
  • 54. 1977, as a senior at Ohio State University, Jesko received a phone call from his father, a specialty-hose manufacturer near Youngstown. “Don’t bother coming back here for a job,” his dad said. “There aren’t going to be any left.” Years later, Jesko returned to Youngstown to work, but he recently quit his job selling products like waterproofing systems to construction companies; his customers had been devastated by the Great Recession and weren’t buying much anymore. Around the same time, a left-knee replacement due to degenerative arthritis resulted in a 10-day hospital stay, which gave him time to think about the future. Jesko decided to go back to school to become a professor. “My true calling,” he told me, “has always been to teach.” One theory of work holds that people tend to see themselves in jobs, careers, or callings. Individuals who say their work is “just a job” emphasize that they are working for money rather than aligning themselves with any higher purpose. Those with pure careerist ambitions are focused not only on income but also on the status that comes with promotions and the growing renown of their peers. But one pursues a calling not only for pay or status, but also for the intrinsic fulfillment of the work itself. When I think about the role that work plays in people’s self-esteem—particularly in America—the prospect of a no-work future seems hopeless. There is no universal basic income that can
  • 55. prevent the civic ruin of a country built on a handful of workers permanently subsidizing the idleness of tens of millions of people. But a future of less work still holds a glint of hope, because the necessity of salaried jobs now prevents so many from seeking immersive activities that they enjoy. After my conversation with Jesko, I walked back to my car to drive out of Youngstown. I thought about Jesko’s life as it might have been had Youngstown’s steel mills never given way to a steel museum—had the city continued to provide stable, predictable careers to its residents. If Jesko had taken a job in the steel industry, he might be preparing for retirement today. Instead, that industry collapsed and then, years later, another recession struck. The outcome of this cumulative grief is that Howard Jesko is not retiring at 60. He’s getting his master’s degree to become a teacher. It took the loss of so many jobs to force him to pursue the work he always wanted to do.
Is Google Making Us Stupid?
What the Internet is doing to our brains
  • 56. 1. “Dave, stop. Stop, will you? Stop, Dave. Will you stop, Dave?” So the supercomputer HAL pleads with the implacable astronaut Dave Bowman in a famous and weirdly poignant scene toward the end of Stanley Kubrick’s 2001: A Space Odyssey. Bowman, having nearly been sent to a deep-space death by the malfunctioning machine, is calmly, coldly disconnecting the memory circuits that control its artificial “brain.” “Dave, my mind is going,” HAL says, forlornly. “I can feel it. I can feel it.” 2. I can feel it, too. Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to
  • 57. drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle. 3. I think I know what’s going on. For more than a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I’ve got the telltale fact or pithy quote I was after. Even when I’m not working, I’m as likely as not to be foraging in the Web’s info-thickets: reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link. (Unlike footnotes, to which they’re sometimes likened, hyperlinks don’t merely point to related works; they propel you toward them.) 4. For me, as for others, the Net is becoming a universal medium, the conduit for most of the information
  • 58. that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded. “The perfect recall of silicon memory,” Wired’s Clive Thompson has written, “can be an enormous boon to thinking.” But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski. 5. I’m not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes
  • 59. a blog about online media, recently confessed that he has stopped reading books altogether. “I was a lit major in college, and used to be [a] voracious book reader,” he wrote. “What happened?” He speculates on the answer: “What if I do all my reading on the web not so much because the way I read has changed, i.e. I’m just seeking convenience, but because the way I THINK has changed?” 6. Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits. “I now have almost totally lost the ability to read and absorb a longish article on the web or in print,” he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a “staccato” quality, reflecting the way he quickly scans short passages of text from many sources online. “I can’t read War and Peace anymore,” he admitted. “I’ve lost the ability to do that. Even a blog post of more than three
  • 60. or four paragraphs is too much to absorb. I skim it.” 7. Anecdotes alone don’t prove much. And we still await the long-term neurological and psychological experiments that will provide a definitive picture of how Internet use affects cognition. But a recently published study of online research habits, conducted by scholars from University College London, suggests that we may well be in the midst of a sea change in the way we read and think. As part of the five-year research program, the scholars examined computer logs documenting the behavior of visitors to two popular research sites, one operated by the British Library and one by a U.K. educational consortium, that provide access to journal articles, e-books, and other sources of written information. They found that people using the sites exhibited “a form of skimming activity,” hopping from one source to another and rarely returning to any source they’d already visited. They typically read no more than one or two pages of an article or book before they would “bounce” out to another site. Sometimes they’d save a long article, but there’s no evidence that they ever went back and actually read it. The authors of the study report: It is clear that users are not reading online in the traditional
  • 61. sense; indeed there are signs that new forms of “reading” are emerging as users “power browse” horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense. 8. Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it’s a different kind of reading, and behind it lies a different kind of thinking—perhaps even a new sense of the self. “We are not only what we read,” says Maryanne Wolf, a developmental psychologist at Tufts University and the author of Proust and the Squid: The Story and Science of the Reading Brain. “We are how we read.” Wolf worries that the style of reading promoted by the Net, a style that puts “efficiency” and “immediacy” above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose
  • 62. commonplace. When we read online, she says, we tend to become “mere decoders of information.” Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged. 9. Reading, explains Wolf, is not an instinctive skill for human beings. It’s not etched into our genes the way speech is. We have to teach our minds how to translate the symbolic characters we see into the language we understand. And the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains. Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli. We can expect as well that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.
  • 63. 10. Sometime in 1882, Friedrich Nietzsche bought a typewriter—a Malling-Hansen Writing Ball, to be precise. His vision was failing, and keeping his eyes focused on a page had become exhausting and painful, often bringing on crushing headaches. He had been forced to curtail his writing, and he feared that he would soon have to give it up. The typewriter rescued him, at least for a time. Once he had mastered touch-typing, he was able to write with his eyes closed, using only the tips of his fingers. Words could once again flow from his mind to the page. 11. But the machine had a subtler effect on his work. One of Nietzsche’s friends, a composer, noticed a change in the style of his writing. His already terse prose had become even tighter, more telegraphic. “Perhaps you will through this instrument even take to a new idiom,” the friend wrote in a letter, noting that, in his own work, his “‘thoughts’ in music and language often depend on the quality of pen and paper.” 12. “You are right,” Nietzsche replied, “our writing equipment takes part in the forming of our thoughts.” Under the sway of the machine, writes the German media scholar Friedrich A. Kittler,
  • 64. Nietzsche’s prose “changed from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style.” 13. The human brain is almost infinitely malleable. People used to think that our mental meshwork, the dense connections formed among the 100 billion or so neurons inside our skulls, was largely fixed by the time we reached adulthood. But brain researchers have discovered that that’s not the case. James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, says that even the adult mind “is very plastic.” Nerve cells routinely break old connections and form new ones. “The brain,” according to Olds, “has the ability to reprogram itself on the fly, altering the way it functions.” 14. As we use what the sociologist Daniel Bell has called our “intellectual technologies”—the tools that extend our mental rather than our physical capacities—we inevitably begin to take on the qualities of those technologies. The mechanical clock, which came into common use in the 14th century, provides a compelling example. In Technics and Civilization, the historian and cultural critic Lewis
  • 65. Mumford described how the clock “disassociated time from human events and helped create the belief in an independent world of mathematically measurable sequences.” The “abstract framework of divided time” became “the point of reference for both action and thought.” 15. The clock’s methodical ticking helped bring into being the scientific mind and the scientific man. But it also took something away. As the late MIT computer scientist Joseph Weizenbaum observed in his 1976 book, Computer Power and Human Reason: From Judgment to Calculation, the conception of the world that emerged from the widespread use of timekeeping instruments “remains an impoverished version of the older one, for it rests on a rejection of those direct experiences that formed the basis for, and indeed constituted, the old reality.” In deciding when to eat, to work, to sleep, to rise, we stopped listening to our senses and started obeying the clock. 16. The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating “like clockwork.” Today, in the age of
  • 66. software, we have come to think of them as operating “like computers.” But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain’s plasticity, the adaptation occurs also at a biological level. 17. The Internet promises to have particularly far-reaching effects on cognition. In a paper published in 1936, the British mathematician Alan Turing proved that a digital computer, which at the time existed only as a theoretical machine, could be programmed to perform the function of any other information- processing device. And that’s what we’re seeing today. The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It’s becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, and our radio and TV. 18. When the Net absorbs a medium, that medium is re-created in the Net’s image. It injects the medium’s content with hyperlinks, blinking ads, and other digital gewgaws, and it surrounds the content with the content of all the other media it has absorbed. A new e-mail message, for instance, may announce its
  • 67. arrival as we’re glancing over the latest headlines at a newspaper’s site. The result is to scatter our attention and diffuse our concentration. 19. The Net’s influence doesn’t end at the edges of a computer screen, either. As people’s minds become attuned to the crazy quilt of Internet media, traditional media have to adapt to the audience’s new expectations. Television programs add text crawls and pop-up ads, and magazines and newspapers shorten their articles, introduce capsule summaries, and crowd their pages with easy-to-browse info-snippets. When, in March of this year, The New York Times decided to devote the second and third pages of every edition to article abstracts, its design director, Tom Bodkin, explained that the “shortcuts” would give harried readers a quick “taste” of the day’s news, sparing them the “less efficient” method of actually turning the pages and reading the articles. Old media have little choice but to play by the new-media rules. 20. Never has a communications system played so many roles in our lives—or exerted such broad influence over our thoughts—as the Internet does today. Yet, for all that’s
  • 68. been written about the Net, there’s been little consideration of how, exactly, it’s reprogramming us. The Net’s intellectual ethic remains obscure. 21. About the same time that Nietzsche started using his typewriter, an earnest young man named Frederick Winslow Taylor carried a stopwatch into the Midvale Steel plant in Philadelphia and began a historic series of experiments aimed at improving the efficiency of the plant’s machinists. With the approval of Midvale’s owners, he recruited a group of factory hands, set them to work on various metalworking machines, and recorded and timed their every movement as well as the operations of the machines. By breaking down every job into a sequence of small, discrete steps and then testing different ways of performing each one, Taylor created a set of precise instructions—an “algorithm,” we might say today— for how each worker should work. Midvale’s employees grumbled about the strict new regime, claiming that it turned them into little more than automatons, but the factory’s productivity soared. 22. More than a hundred years after the invention of the steam engine, the Industrial Revolution had at last found its philosophy and its philosopher. Taylor’s tight
  • 69. industrial choreography—his “system,” as he liked to call it—was embraced by manufacturers throughout the country and, in time, around the world. Seeking maximum speed, maximum efficiency, and maximum output, factory owners used time-and-motion studies to organize their work and configure the jobs of their workers. The goal, as Taylor defined it in his celebrated 1911 treatise, The Principles of Scientific Management, was to identify and adopt, for every job, the “one best method” of work and thereby to effect “the gradual substitution of science for rule of thumb throughout the mechanic arts.” Once his system was applied to all acts of manual labor, Taylor assured his followers, it would bring about a restructuring not only of industry but of society, creating a utopia of perfect efficiency. “In the past the man has been first,” he declared; “in the future the system must be first.” 23. Taylor’s system is still very much with us; it remains the ethic of industrial manufacturing. And now, thanks to the growing power that computer engineers and software coders wield over our intellectual
  • 70. lives, Taylor’s ethic is beginning to govern the realm of the mind as well. The Internet is a machine designed for the efficient and automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the “one best method”—the perfect algorithm—to carry out every mental movement of what we’ve come to describe as “knowledge work.” 24. Google’s headquarters, in Mountain View, California—the Googleplex—is the Internet’s high church, and the religion practiced inside its walls is Taylorism. Google, says its chief executive, Eric Schmidt, is “a company that’s founded around the science of measurement,” and it is striving to “systematize everything” it does. Drawing on the terabytes of behavioral data it collects through its search engine and other sites, it carries out thousands of experiments a day, according to the Harvard Business Review, and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it. What Taylor did for the work of the hand, Google is doing for the work of the mind. 25. The company has declared that its mission is “to organize
  • 71. the world’s information and make it universally accessible and useful.” It seeks to develop “the perfect search engine,” which it defines as something that “understands exactly what you mean and gives you back exactly what you want.” In Google’s view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can “access” and the faster we can extract their gist, the more productive we become as thinkers. 26. Where does it end? Sergey Brin and Larry Page, the gifted young men who founded Google while pursuing doctoral degrees in computer science at Stanford, speak frequently of their desire to turn their search engine into an artificial intelligence, a HAL-like machine that might be connected directly to our brains. “The ultimate search engine is something as smart as people—or smarter,” Page said in a speech a few years back. “For us, working on search is a way to work on artificial intelligence.” In a 2004 interview with Newsweek, Brin said, “Certainly if you had all the world’s information directly attached
  • 72. to your brain, or an artificial brain that was smarter than your brain, you’d be better off.” Last year, Page told a convention of scientists that Google is “really trying to build artificial intelligence and to do it on a large scale.” 27. Such an ambition is a natural one, even an admirable one, for a pair of math whizzes with vast quantities of cash at their disposal and a small army of computer scientists in their employ. A fundamentally scientific enterprise, Google is motivated by a desire to use technology, in Eric Schmidt’s words, “to solve problems that have never been solved before,” and artificial intelligence is the hardest problem out there. Why wouldn’t Brin and Page want to be the ones to crack it? 28. Still, their easy assumption that we’d all “be better off” if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation.
  • 73. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive. 29. The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the Internet, it is the network’s reigning business model as well. The faster we surf across the Web—the more links we click and pages we view—the more opportunities Google and other companies gain to collect information about us and to feed us advertisements. Most of the proprietors of the commercial Internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link—the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It’s in their economic interest to drive us to distraction. 30. Maybe I’m just a worrywart. Just as there’s a tendency to glorify technological progress, there’s a countertendency to expect the worst of every new tool or machine. In Plato’s Phaedrus, Socrates bemoaned the development of writing. He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their
  • 74. heads, they would, in the words of one of the dialogue’s characters, “cease to exercise their memory and become forgetful.” And because they would be able to “receive a quantity of information without proper instruction,” they would “be thought very knowledgeable when they are for the most part quite ignorant.” They would be “filled with the conceit of wisdom instead of real wisdom.” Socrates wasn’t wrong—the new technology did often have the effects he feared—but he was shortsighted. He couldn’t foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not wisdom). 31. The arrival of Gutenberg’s printing press, in the 15th century, set off another round of teeth gnashing. The Italian humanist Hieronimo Squarciafico worried that the easy availability of books would lead to intellectual laziness, making men “less studious” and weakening their minds. Others argued that cheaply printed books and broadsheets would undermine religious authority, demean the work of scholars and scribes, and spread sedition and debauchery. As New York University professor Clay Shirky notes,
  • 75. “Most of the arguments made against the printing press were correct, even prescient.” But, again, the doomsayers were unable to imagine the myriad blessings that the printed word would deliver. 32. So, yes, you should be skeptical of my skepticism. Perhaps those who dismiss critics of the Internet as Luddites or nostalgists will be proved correct, and from our hyperactive, data-stoked minds will spring a golden age of intellectual discovery and universal wisdom. Then again, the Net isn’t the alphabet, and although it may replace the printing press, it produces something altogether different. The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author’s words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas. Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking. 33. If we lose those quiet spaces, or fill them up with “content,” we will sacrifice something important not
  • 76. only in our selves but in our culture. In a recent essay, the playwright Richard Foreman eloquently described what’s at stake: I come from a tradition of Western culture, in which the ideal (my ideal) was the complex, dense and “cathedral-like” structure of the highly educated and articulate personality—a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West. [But now] I see within us all (myself included) the replacement of complex inner density with a new kind of self—evolving under the pressure of information overload and the technology of the “instantly available.” 34. As we are drained of our “inner repertory of dense cultural inheritance,” Foreman concluded, we risk turning into “‘pancake people’—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.” 35. I’m haunted by that scene in 2001. What makes it so poignant, and so weird, is the computer’s
  • 77. emotional response to the disassembly of its mind: its despair as one circuit after another goes dark, its childlike pleading with the astronaut—“I can feel it. I can feel it. I’m afraid”—and its final reversion to what can only be called a state of innocence. HAL’s outpouring of feeling contrasts with the emotionlessness that characterizes the human figures in the film, who go about their business with an almost robotic efficiency. Their thoughts and actions feel scripted, as if they’re following the steps of an algorithm. In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.
Assignment 3
Evaluate the arguments of the following articles (in GD>Resources>Readings>Portfolio 2 readings):
· “The Internet: Is It Changing the Way we Think?” https://www.theguardian.com/technology/2010/aug/15/internet-brain-neuroscience-debate