A free eBook by Harish Shah, endearingly known as "The Singapore Futurist", presenting scenarios of life in the future, where the emerging technologies of the early 21st Century reach their full potential and converge to effect optimal synergies.
Dedication
Firstly, I dedicate this free eBook to my alma mater, The University
of Western Australia. There is no other source from which I have derived
a greater sense of purpose, meaning and inspiration in my life.
Secondly, I dedicate this eBook to my wife, our two children, my
parents and my parents-in-law. They have all borne immense personal
costs upon themselves, in the course of my journey as a Professional
Futurist, the culmination of which up to the point of publishing this
eBook, is this eBook itself. The costs they have borne which I
acknowledge and refer to herein would warrant a printed full-length non-
fiction book in itself.
Last but not least, I dedicate this eBook effort, to all my fellow
credible Professional Futurists from around the world, both past and
present, whose invaluable insights have contributed greatly to the
trajectory of technology’s general evolution, that is taking humanity
towards Techtopia. I salute their passions, selflessness and their personal
sacrifices that would have made their respective individual works
possible.
Table of Contents
About the Author
Preface
Introduction
Chapter 1: Starting the Day
Chapter 2: Morning at the Conference
Chapter 3: Taking a Morning Off
Chapter 4: Another School Day
Chapter 5: Commuting
Chapter 6: Social Media
Chapter 7: Creative Pursuit
Chapter 8: Medicine
Chapter 9: eSports
Chapter 10: Cooling the World
About the Author
An autodidact and a polymath, Harish Shah is Singapore's first locally
born full-time Professional Futurist. A world-leading Futurist in the
area of Technological Evolution & Foresight, he is most commonly and
endearingly referred to as "The Singapore Futurist", apart from also being
referred to as "The Asian Futurist" and "The Millennial Futurist".
Harish’s Future-Focused essays and articles have been widely
published around the world in various types of publications, including
Singapore’s The Singapore Marketer, published by the Marketing
Institute of Singapore, and the now defunct The Futurist, which used to be
published by the World Future Society.
Harish is a Commerce graduate from The University of Western
Australia, with Triple Majors in Human Resource Management,
Industrial Relations and Management; he also studied electives in
Marketing there.
As a Future-Focused Keynote Speaker, addressing C-Suite
audiences, Harish is known to step off every stage leaving a profoundly
memorable impression with his unrivalled and unbridled electric energy,
which he draws from his enthusiastic passion for Futures Studies.
Preface
There has never been a more exciting time in human history to talk
about technology or the future in general than the years leading up to
the end of the first quarter of the twenty-first century, as humanity
witnesses the seemingly abrupt and rapid manifestation of
technologically driven new realities, once considered impossible
outside of fantasy or fiction, within the lifetimes of living
contemporary generations. These new realities are radically reshaping
the very human experience, across every aspect and facet of life.
The rapid technological evolution of the day, in the period when the
writing of this eBook began, is closely accompanied and intertwined
with the evolution of lifestyles, education and business.
Viewed through a pessimistic lens, the writing of this eBook began in a
period bleaker than ever for the human species, with unprecedented scenarios of
trade wars, pandemics, terrorism, border conflicts, rapid climate change,
widening income gaps, recalcitrant persistence in reckless human activity
driving environmental destruction on epic scales, radical politics and
socio-political upheavals, as if a conspiracy were well in execution to
ensure human extinction.
Yet in the midst of the acutely dampening and devastating chaos,
human ingenuity has stubbornly thrived and persisted, creating
many new and different types of technological innovations and
inventions that would neither have been possible in the century prior,
nor been believable beyond academic theory as recently as just a
decade before the completion of this eBook. It is ingenuity that,
collectively and in convergence, spells hope for humanity attaining a
technological utopia, in which most of the major problems that humanity
began the twenty-first century with would no longer exist.
That technological utopia, in this eBook, including in its very title, is
shortened and referred to as "Techtopia".
This eBook has been written because it became timely and relevant
to answer the question: where is the technological evolution, that has
been continuously occurring and accelerating throughout the first quarter
of the twenty-first century, taking humanity? And what will our world
look like after the end of that journey of evolution?
To paint a simplified palatable picture of the likely human
experience scenarios of the impending future arising from the ongoing
technological evolution, the prime challenge in writing this eBook was
not in determining what to include, but rather in what not to include.
To keep the scenarios presented in this eBook palatable to a lay
reader, it was necessary to deliberately keep the chapters, and the
overall eBook, concise and therefore intentionally short. Paradoxically,
the effort towards achieving that was the most time-consuming aspect of
the writing process.
At the end of the writing process, the outcome is a short eBook, with
short chapters, consisting of scenarios that suffice to present to the
reasonable reader a coherent sense of the future that the technological
evolution of the day is leading humanity into. And it is only fitting
that this eBook is a short read, because when that future arrives, this
eBook will no longer have any purpose for existing at all. The temporary
relevance in time of the material herein is aptly and significantly
reflective of the effects of time, technology and evolution.
Introduction
This eBook is an attempt to present a glimpse ahead of time of a
plausible and probable future that awaits us, as a result of the
technological evolution in progress at the time of writing. It is
intended to present that glimpse by letting the reader into the mental
vision of a Professional Futurist.
Nanotechnology. Robotics. Internet of Everything. 3D-Printing. 4D
Material. Tactile Internet. X Reality. Brain Computer Interface. Artificial
Intelligence. Autonomous Machines. At the time of the writing of this
eBook, the technologies mentioned in this paragraph are just some,
amongst an extensive pantheon, in rapid development towards
mass-commercialisation, even as much of humanity is only really
beginning to grapple with the true long-term implications of widespread
internet access.
Taking the technologies mentioned in the preceding paragraph at
their optimal potential, scenarios are crafted of the coming convergent
effects of those technologies upon various aspects of life. Those
scenarios, built upon what can reasonably be foreseen of the various
emerging technologies today, are presented in the chapters ahead.
Nothing happens in a silo. Infinite things happen concurrently. And
eventually, everything converges. That is the primordial truth of the
chaotic universe, the galaxy therein, the solar system therein, our planet
earth therein, and thereupon, our lives as human beings.
The future as a destination painted in this eBook is not going to be
the result of a single journey. The future painted herein will result from
a vast myriad of different and separate journeys finding convergence at
different points in time leading up to it.
When knowledge of science and technology crossed certain
thresholds in the twentieth century, an expansive pantheon of separate
and distinct journeys of technological progress and evolution began,
across different spaces, for different purposes and in different directions.
All of those separate journeys across those wide arrays of directions, will
eventually come to converge and merge, fusing to create one end
destination of a pervasive technological reality. What will the human
experience and society on planet earth possibly look like, when that
convergence and fusion is complete? In the chapters ahead in this eBook,
you can expect to gain a small glimpse of precisely such a time, in
response to that very question.
While the picture painted herein is doubtlessly of a distant long-term
future, it is presented to the reader with the intention of providing
inspiration to work towards that future, for the betterment of the overall
human experience.
Chapter 1: Starting the Day
John's eyes open to sunlight shining through his window. His mind
wanders through the list of warm breakfast options, from across the
world's food cultures, that his fully autonomous and robotically equipped
kitchen can freshly prepare for him, with the ingredients available in the
cabinets and on the shelves within his kitchen.
Once John decides what he wants, the kitchen starts preparing the
first meal of the day, as he gets up from bed, and heads into the
bathroom.
Brainwave sensors around his home remotely detect his thoughts, to
transmit them to the cloud, so that the myriad of devices, appliances and
equipment around him execute his will, seamlessly and instantly. John is
able to constantly interact with his technological assets, both physical
and virtual, dynamically, uninterrupted and directly, via his thoughts.
While brushing his teeth in the bathroom, John quickly reviews the
recordings of his dreams from the night. He selects certain recordings
he finds particularly interesting, instructs the software to carry
out slight tweaks and edits for visual clarity, and he then posts them for
sharing on his social network timeline, in the virtual world, so that his
friends and family can experience those dreams that he had.
While in the shower, John's home management system updates him
on groceries that are running low, as well as other essential home items
that require replenishing. John mentally instructs the system to make the
purchases with some additions and changes to the list. He wants to try a
new deodorant that he has seen being promoted through social media.
The purchase orders are automatically placed by the system. By the
end of the day, a driver-less vehicle will arrive at a parking space
closest to John's home and it will dispatch drones carrying all the items
ordered to his home. His domestic robobutler will collect, check and
store the items. The robobutler will alert John to any discrepancy. If there
is none, the payment will be automatically transferred from John's bank
account to the supplier.
As John, dressed up and ready for the day ahead of him, sits down at
his dining table for breakfast, his robobutler places the tray with his
freshly prepared breakfast along with hot coffee in front of him. John
puts on his smart glasses that stream sound directly to his skull.
He enters a simulation of the outdoor sitting area of a cafe in Venice.
The weather is particularly pleasant in Venice on this morning. Whatever
John is seeing, is a streamed recording through onsite 360-degree
cameras, captured on the morning of the calendar day prior, at an actual
café site in Venice, to compensate for the actual physical world time
difference between Venice and Singapore, where John is. He has virtually
teleported himself to the cafe location in Venice, back in time, to the
day prior. He sees the food that his physical kitchen has prepared for
him, on the simulated table in front of him. He is experiencing a reality
that is a fusion of the actual and the simulated. Next to him, is Alice, who
is also, not really in Venice. She lives close by, about half a kilometre
away from John, in Singapore.
"Nice choice,” Alice comments, on John's chosen simulation for his
breakfast environment. She has peeked into it.
John decides to peek into the simulated environment which Alice has
chosen to enjoy her own breakfast in. She has decided to have breakfast
on a terrace, in Oia, Santorini, Greece, facing the Amoudi Bay. "Yours
too,” John replies, before returning to his cafe in Venice.
Both John and Alice see what the other is eating, for real, in this
mixed cross-reality connection that they share, even as they see their own
respective chosen environments around them, streamed and simulated,
with the help of the lenses on their respective smart glasses.
They can jump from one virtual space to another at will, thanks to
the non-invasive Brain Computer Interface points built into both their
smart glasses and their home surroundings. They indulge in a
conversation while eating, as if they were seated right next to each other,
in the same physical space. Their images are streamed into each other's
simulated realities, thanks to hardly noticeable micro-cameras placed
around their homes, which capture and stream their 360-degree images
and movements in real time. They speak to each other with thought,
and hear each other's simulated voices, streamed directly to their skulls.
Before both of them end their breakfast and decide to part for the
day, they agree to meet in the evening, offline, in the physical world.
John pictures a creative, moving, shape-shifting vase in his mind,
that he wants to give as a gift to Alice. He instructs his 3D-Printing
service to produce one for him.
In the current time and age, John no longer subscribes to a phone
company or internet service provider or even a cable television company.
These days, people subscribe to types of services that are very different
from those of the twentieth century or the early twenty-first century, a
period in which John was born. He vividly remembers people walking
around with rectangular objects that were called "smartphones" from
when he was a child. His Mom and Dad used to get monthly bills for
using those.
A swarm of nanobots enter John's apartment through his window,
converging, coordinating and collaborating over a table, to 3D-print the
vase John wants to give to Alice in the evening. It will precisely
emulate the image in John's mind as he conceived it, in appearance,
dimensions and transformation functions, thanks to the use of 4D
material.
These nanobots are perpetually floating about, owned and managed
by a service provider, that charges a monthly fee to users, who can
instruct them to 3D-Print objects for them. The bills cover material use,
bot maintenance and management, as well as a mark-up for the company's
profit.
The nanobots draw energy from the sun, or otherwise thermal energy
from the ground. When neither is possible, for whatever reason, there are
charging nodes within proximity of where the swarms are deployed, that
remotely charge the nano-bots, without contact, using stored energy from
other sustainable sources. The nodes also serve as points for reloading
depleted 4D printing materials. The different materials are supplied to
the nodes by autonomous drones, which are deployed when each node
Chapter 2: Morning at the Conference
John leads an extermination company. Done with breakfast, he enters
a conference, held in the virtual world today, at which he is scheduled to
present his company's latest offering.
The conference is held in a simulated lounge setting with a strict suit-
and-tie dress code. While John is physically dressed in his casuals,
sitting in his armchair in the comfort of his home, in the simulated
setting his avatar appears in a black suit, with a blue shirt and a dark
striped tie. His hair appears neatly combed to his fellow delegates, as he
takes his virtual seat on a trendy cushioned chair.
It is a "closed door" conference today for industry insiders, bridging
prospective government buyers and private sector sellers. The organizer
presents a welcoming speech. Then there is an Opening Keynote speech.
This is immediately followed by John's product presentation.
John greets his fellow delegates in attendance. They see his lips
moving and hear his voice. However, John is not actually speaking.
Software artificially simulates his voice in the virtual space, relaying
the words as he thinks them up in his head, picked up in real time by
the Brain Computer Interface system embedded in his smart glasses.
After greeting his fellow delegates, John allows them to experience a
simulation of his company's latest product at work. As part of the
simulation, the conference delegates experience the point of view
perspective of tiny nanobots, that move through drainage systems, hard
to reach spaces under furniture and other spaces that rats typically move
through. The nanobots take cues and directions from the Internet of
Things beacons or nodes that detect rat movements, to track down live
moving rats. Once a group of nanobots closes in on a live rat, together
they discharge remote electrical charges in synchrony, at sufficient
voltage to cause the rat's instantaneous death.
After the extermination, different methods are used, for the disposal
of the dead rat, depending on where the rat has been killed.
If it is in the sewage or a drainage system or somewhere
underground, a roving incineration drone is activated to reach the spot
where the dead rat is. The drone pulls the rat’s carcass into itself, and
safely but quickly incinerates it at high temperatures down to ashes. The
ashes are then discharged into the drain or sewage system to flow away
with other waste.
If a rat has been killed above ground or in a more reachable space, a
flying drone is deployed to pick up the carcass with a safe and enclosed
containment unit attached to it. Before flying off, the drone discharges
powerful disinfectants and cleans up the spot where the rat has been
exterminated. As the drone flies off, the rat’s carcass is incinerated
within the containment unit. The ashes are then appropriately disposed of
before the flying drone is deployed to the next extermination point, for
the next clean-up.
The drones deployed by John’s company do not emit carbon dioxide or
other gases in the process of incinerating the pests. Rather, they retain
the carbon dioxide, to be relayed and released into carbon sinks for
absorption and natural conversion.
The nanobots and the drones are constantly powered by thermal
energy, tapping on heat from the ground or the atmosphere. When they
are in sunlight, they are also powered by solar energy.
After a round of applause as the demonstration ends, John speaks about the
safety and security measures that ensure that the system will not cause
any harm to human beings or other animals such as domestic pets,
despite being fully automated and autonomous.
The primary intended buyers for the extermination product, as a
service, that John is pitching to, are municipal organisations, building
management companies and estate management organisations.
John ends his presentation and fields a couple of technical and cost
queries from interested prospective buyers. His presentation is followed
by that of a private security vendor, looking to pitch an anti-terrorism
service to representatives from government agencies responsible for state
security and law enforcement.
Representing the security company is Haruto. Haruto does not speak
a word of English, for he hardly uses the language himself offline in his
surroundings, though he was schooled in it as an additional language
decades ago. However, the delegates who are present are able to hear his
greetings and speech in perfect British English, as his thoughts are
directly translated and relayed in the virtual space, in the likeness of his
voice, in real time.
As did John earlier, Haruto commences the simulation of his product
demonstration, for the delegates to experience. They get a 360-degree
perspective of the simulated events. As the simulation begins, an internet
user is seen attempting to access and disseminate content that is
communally divisive in nature with a high propensity to inspire or
instigate motivations for hate, as well as, violence. The content is
disparaging and condescending in nature towards religions other than
that of the internet user himself. This activity is detected and tracked by
artificially intelligent autonomous virtual bots.
As the system assesses and deduces the intentions and motivations
of the internet user, the activity is relayed to the nearest local
police personnel with jurisdiction over the matter. The user is blocked
from accessing the material which he is trying to access. Also blocked
are his attempts to disseminate messages, with potential or propensity for
incitement, to others. A voice message is dispatched and played to the
internet user through his devices, advising him to cease his divisive
activity, warning him that he is threatening societal peace, order and that
he is breaking the law. He is informed that police officers have been
dispatched to speak with him.
The user is defiant. He shuts down his devices. As he does so, an
aerial unmanned drone scans through his home, past the windows and the
curtains, to detect his movements, forming visuals from heat signatures.
The internet user is packing his bag. It seems that he is intending to run.
The drone crashes into his window to shatter the glass. Once the window
glass is broken, the drone blasts an audio message telling him to remain
within his premises and remain calm, just as a second drone has flown
underneath it, springing open a canvas to collect and prevent the
shattered glass from falling onto passers-by along the building on the
ground floor. The user does not comply with the audio message. He rushes
for his front door and opens it. Waiting for him there is another flying
drone that discharges a remote electrical charge, at a voltage that is not
lethal for the target but temporarily stuns and paralyses him, long enough
for the human police personnel to pick him up.
As soon as the internet user's nefarious activity was detected, the
system had already automatically tapped his medical history, to
determine an appropriate electrical voltage to incapacitate him without
killing him.
The police personnel arrive and cut open the grills on the front gate
to the apartment of the man who is now a terrorism suspect. The
personnel then stand aside as a large aerial drone flies through with an
enclosed stretcher. The drone safely lifts him with robotic arms and loads
him onto the stretcher, without the police personnel having to touch him.
The suspect is safely strapped in, and the drone relays him to a safe
hospital facility where he will undergo a series of medical examinations,
before being taken into police custody for detention.
The deliverable of Haruto's product offering means that nothing that
is a threat to society, with the potential to escalate into armed
insurgency, militancy or the use of weapons to carry out terrorist attacks
or trigger divisive violence, may go undetected online; it is immediately
addressed at the very start of such a possibility developing. The state
authorities, by using such an outsourced service, will not be bogged down
with the monitoring and tracking workload. The vendor would maintain
the hardware, in the form of drones for physical offline enforcement, that
the state actors would not need to worry about. It would also mean safety
for police personnel, and diminished chances of potential terrorists
getting away before the police could reach them.
Haruto then fields questions after his simulated demonstration ends,
on legalities and issues of privacy. There are a good number of questions
raised on ethics. Given that the product minimizes risks of death and
injury to all parties, Haruto gains a good buy-in from the delegates
representing government agencies. One of the government delegates
present comments that it would be a game changer in efficiency, in
deterring and countering homegrown terrorism. The most important
draw, is to detect, deter and contain potential terrorists, even before they
acquire capabilities or resources to cause physical harm to others.
At the end of Haruto's presentation, the conference delegates break
for lunch.
Chapter 3: Taking a Morning Off
Alice's personal robobutler clears her dining table as she is done with
her breakfast. She is taking the morning off from work and is going
shopping.
She puts on her haptic suit, complete with socks and gloves, covering
all her skin from her soles to the top of her neck. The haptic clothing
allows her to feel distant objects remotely or virtual objects as if they
were real.
Alice enters a Virtual Mall, which she visually experiences around
her thanks to projections on the lenses of her smart glasses. As she sees
an artificially simulated life-like mall environment, crowded with other
shoppers from around the world, her AI Smart Assistant speaks to her,
with the voice streamed right to her skull, about what is new, what is on
offer and what she may be interested in, based on past shopping patterns.
Alice moves into a virtual dress shop. With space limitation not an
issue in virtual worlds, the shops are huge and generously spread out.
Alice walks past endless rows of mannequins. She stops when she sees a
design she likes. A mirror appears in front of Alice, in which she sees
herself dressed in the same dress that she sees on the mannequin next to
her which has caught her eye. The dress is red. She wonders if it will
look better on her, if it were in pink, and the colour of the dress in the
mirror turns pink, down to the exact tone and shade that she has in mind.
Then she wonders if it would be better if the shade of pink was just
slightly darker, and the mirror shows her that precise option. She thinks
to herself that the look is perfect after that last customisation.
With her haptic suit, Alice is able to experience the feel of the fabric
that the dress would be made from on her skin. She thinks it is
comfortable to her liking. The virtual mirror in front of her is able to
show to her an image of herself, from all angles, including her back
without her having to turn or move. Her AI Smart Assistant tells Alice
the price, which seems reasonable. She wants to buy it. Customised to her
measurements and colour requests, the dress can be made and drone-
delivered to Alice through the distributed value chain within three days.
However, the dress also promises haptic features: wearing it, wherever
she is on the move, she can remotely hug her loved ones spread across
the world, and experience weather and seasons from different parts of
the world. She wants to test out and experience those haptic features
for herself offline, in the physical world, before making the purchase.
The virtual mirror in front of Alice turns into a virtual screen
displaying a map and the route from her home to the nearest physical
shopping outlet where Alice can try on the dress sample, to test the
haptic features, along with the address and location view. Her AI Smart
Assistant schedules a visit for her to the shop within the week.
After trying on a few more dresses, Alice "teleports" into a toy store
because she wants to buy a present for her nephew's upcoming birthday.
She already has an idea of what she wants. She heads right towards a
dancing bear. It plays its own music, with limitless options, as it can
download new songs from the internet when they are released, and it
sounds like it is singing those songs itself for real. It dances on its own
without the need for any touch-based interface or instruction. There is no
"on" or "off" switch and it needs no instructional manual. It is powered
by sunlight through its eyes or thermally by heat from its surroundings if
the atmosphere is warm enough. What Alice likes about it, is that the
bear is able to keep an eye on children, streaming live views of the
children to a parent's smart glasses. It can also automatically raise an
alarm when a child is unwell, at risk or in danger. The bear is also able to
intervene to keep the child safe.
Alice tests the bear by reaching for a virtual power socket. Before
she can touch the socket, the bear races forward and firmly grabs her
wrist to stop her. Then in a friendly and comforting voice, the bear says,
"oh, please don't do that, that is dangerous. Your mommy loves you."
Alice thinks that the bear is adorable. She is informed that it can be
drone-delivered to her within twenty-four hours. She places the order and
the payment amount is automatically deducted from her bank account.
Alice window shops for a while more and she still has time before
resuming work after lunch, so she steps into a cinema room within the
virtual mall. She selects a movie about a Motorcycle Grand Prix racer.
Alice experiences herself in the role of the central character in the
movie, experiencing the high speed and thrilling action around her. She
experiences the vibrations of riding a racing motorcycle through her
haptic suit. All the other characters seem real all around her. At various
points in the story, Alice gets carried away, forgetting that it is all
simulation, just a movie, and none of it reality, given the life-like
360-degree nature of the experience.
The movie ends, just as it is time for lunch. To give her eyes a break,
Alice withdraws from the virtual world and takes off her smart glasses.
She admires the garden along her window sills, tended by hardly
noticeable microbots built into the plant hedges.
Alice's robobutler is ferrying the food from the autonomous kitchen
to her dining table.
Chapter 4: Another School Day
Brothers Bhuvan and Shravan have just finished their breakfast. It is
now time for school. They get comfortable in their respective chairs and
put on their respective smart glasses.
The generation of Bhuvan and Shravan no longer goes through tests
and examinations, with scores and grades. Basic school education for
them is entirely about identifying their potential, their intelligence
strength areas, their intrinsic interests, their thinking styles, and then it is
about developing their competencies along those factors, to prepare them
for university education. At the end of their respective twelve-year
school journeys, Bhuvan and Shravan, based on their needs and
aptitudes, will be matched to the most suitable university course, from
anywhere in the world, for a Bachelor's degree. The artificially intelligent
and autonomous internet of the day, supports such a matching function.
Bhuvan is younger. The school system has identified from his
combination of intelligence strengths, interests and inclinations, that he
has a potential for a career in science, technology and engineering. His
personalized and individually tailored education is geared towards
preparing him for his dream job of innovating spacecraft. Bhuvan wants
to work in the space mining industry. He wants to design new
autonomous craft that would reach passing asteroids and comets beyond
the boundaries of the solar system at high speeds, to extract minerals
from them.
Bhuvan's first lesson today is mathematics. The AI-enabled
cloud-based programme runs equations down his lenses, at a pace he is
comfortable with, instantly detecting his aptitude and ability to solve
them through non-invasive brainwave sensing and reading. It saves a lot
of time compared to when his parents went to school, when it was
impossible for teachers to read the minds of students and to know
precisely whether a student knew something or not, without testing them
with pen and paper. And the teachers back then had to cater to the
differing needs of twenty to forty students in a classroom at a time.
Today, the teaching job is very
different. Teachers now create and manage the curriculum that the
cloud-based system administers autonomously, personalizing it for each
student based on that student's unique individual needs.
When Bhuvan is shown an equation that he is not able to solve, the
system pauses immediately. Bhuvan is then visually shown the
explanations for each part of the formula which he has been unable to solve.
He is guided along by a simulated voice streamed directly to his skull.
Where Bhuvan is unable to follow, keep up or remember what is
explained to him, the system detects the lapse by tracking his brain
signals and stops automatically. It then works out alternative means and
methods to help Bhuvan understand the different
parts and steps in the particular formula. Only when Bhuvan is fully
comfortable with applying the formula effectively, does the system test
whether his learning has been effective, by changing the variables in the
equation where he had been stuck earlier, and letting him try out the
formula’s practical application again. When Bhuvan is successful with
multiple attempts, the system brings up another equation requiring the
application of a formula that Bhuvan was having difficulty learning a
few days prior. Bhuvan is able to solve that equation effectively. The
system then teaches him a new formula which he has not been taught
before, and it helps him to master it, before presenting him with new
equations to test his ability to apply the new formula.
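The loop just described (skip what is already known, try alternative explanations when a lapse is detected, then queue the newly mastered formula for re-testing a few days later) can be sketched in ordinary code. This is purely an illustrative sketch: the function names are invented, and the `understood` callback stands in for the story's brainwave-based comprehension check.

```python
from collections import deque

def run_lesson(formulas, understood, explain_styles):
    """Adaptive tutoring loop. 'understood(f)' is a stand-in for brain
    sensing; 'explain_styles(f)' yields alternative ways to teach f."""
    review = deque()   # formulas queued for spaced re-testing
    taught = []
    for f in formulas:
        if understood(f):
            continue                 # never re-teach what is already known
        for teach in explain_styles(f):
            teach(f)                 # try an alternative pedagogy
            if understood(f):        # re-test with fresh variables
                break
        review.append(f)             # revisit in a few days' time
        taught.append(f)
    return taught, review

# Demo with a stubbed "brain": the first explanation style always works.
known = {"linear"}                   # the student already knows linear equations
def understood(f):
    return f in known
def explain_styles(f):
    return [lambda x: known.add(x)]  # e.g. visual walkthrough, then verbal

taught, review = run_lesson(["linear", "quadratic"], understood, explain_styles)
```

Run against the stub, only the unfamiliar formula is taught and queued for review; the known one is skipped without wasting any time on it.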
Learning is far more efficient in Bhuvan's time because of the
direct two-way brain-to-cloud interaction. It is no longer just humans
interfacing with technology; now, technology interfaces with humans.
And that saves a lot of time. The system will not waste time teaching
Bhuvan what he already knows, and will spend no more time than
necessary on what he can learn quickly. When Bhuvan has difficulty
understanding or learning something, the system detects the precise
reason through his brain and thought activity, solves the pedagogical
issue, and thus ensures effective learning by alternative means
customised autonomously just for him.
If Bhuvan is found to lack aptitude for something, the system will
use that information to advise him and his parents, and will alter the
course of his education to prepare him more appropriately for a future
university programme. Bhuvan does not have to worry about failing.
With direct-to-brain personalized education, every individual's strengths
are fully recognized, utilized and optimized. The education system is no
longer about filtering, fragmenting, segregating or eliminating. The
education system of today is purely about enabling and empowering.
Shravan, Bhuvan's older brother, was identified early, as far back as
his kindergarten days, as having an outstanding imagination and strong
potential for creativity. Shravan has an intrinsic musical sense and
interest. He wants to pursue a career in the arts, particularly in music.
He is also quick to pick up languages.
Shravan is starting the day with a lesson in Italian, followed by
English. He was learning French the day before. Learning five languages
in total through schooling, Shravan is being prepared to be truly
multilingual.
All the languages are taught to Shravan in the same manner. As
words, phrases and sentences are projected onto his lenses, the system
detects his brain's reactions to identify gaps in his memory,
interpretation, knowledge or grammatical logic, and corrects him
immediately. For each language, he is guided along at a pace with which
he is comfortable. The pedagogy is meant to enable him to express his
thoughts effectively in any of the languages he is learning. Later in life,
this will allow him to appreciate and infuse elements from different
cultures into his musical works, as well as enable him to write songs
across the different languages.
Once his Italian and English lessons are over for the morning,
Shravan is given voice training. It is one of the few subjects in the
present day in which school students use their actual voices. In this
subject, Shravan is trained to control his volume, pitch and projection,
both while speaking and singing, while being corrected and guided by
the system as and where necessary.
After voice training, Shravan has a consultation session with Michael
Jackson, a legendary Pop music artiste who passed away in 2009, long
before Shravan was born. Yesterday, Shravan had a consultation with
Frank Sinatra.
The school system generates simulated studio environments, that
music students like Shravan find themselves virtually immersed in,
through their smart glasses, with legendary artistes of the past brought to
life virtually. In these simulations, the virtually simulated artistes of the
past guide and coach the current students on various aspects of musical
creativity, use of instruments, on dance steps and so on. Shravan's
favourite sessions are with Michael Jackson, because he particularly
loves learning dance moves from him. For that he needs to get out of his
chair and ensure that the space around him in his room is clear, so that he
will not bump into anything, while seeing the virtual studio around him
rather than his actual physical room, as he tries out the dance steps.
While Shravan is spending time learning in the virtual studio from
Michael Jackson, Bhuvan has ended his mathematics lesson and is
experimenting with acids in a virtual chemistry lab, simulated through
his smart glasses. From the safety of his home, with the help of gesture
recognition and tactile technology, Bhuvan is mixing chemicals in the
virtual world, completely safe from any effects of accidental spillage or
chemical reaction from a wrong step. All homes today are set up with
infrastructure to support detection and recognition of the most subtle
gestures, remotely, anywhere within their respective floor areas. The
infrastructure instantly transmits the gesture information to any platform
a household member is using, to support a seamless real time experience,
across the physical and virtual worlds.
Bhuvan is fortunate that the technology of the day supports such
learning of chemistry, safely, for children of his age. When his parents
were of that age, they did not get to learn chemistry. They had to wait
until they were a few years older for safety reasons, regardless of their
aptitude or potential.
At lunch time, both Bhuvan and Shravan end their curricular lessons
for the day. After lunch, they will have to pick up their school bags to
head for their physical school campus, in the actual world. School
campuses for the generation of Bhuvan and Shravan no longer require
classrooms, though each school maintains a few, in the event of a
technical breakdown, when offline physical facilitation of learning
becomes necessary while the technical problems are fixed.
Bhuvan and Shravan are required to go to the physical campus to use
the sporting facilities, such as the running track, football field, swimming
pool and martial arts dojo. For Bhuvan and Shravan, swimming lessons
are compulsory. They will both have swimming lessons today, in the
evening, at the school's open-air pool. Yesterday, both of them were
attending martial arts training. Bhuvan is learning Tae Kwon Do.
Shravan is learning Jiu Jitsu. Tomorrow, Bhuvan will have track
practice, for the hundred- and two-hundred-metre sprints, while Shravan
will be playing football.
Chapter 5: Commuting
Bhuvan and Shravan have just boarded a train from a station just
fifty metres away from their apartment building. They are on their way to
school for their mandatory swimming lessons. It is a fifteen-minute ride
from their starting point to the station right next to the school's entrance,
with all the stops in between.
Bhuvan and Shravan belong to a generation that does not know what
it is like to be on a crowded train. With the extensive virtualisation of
business processes, mass automation, ubiquitous robotisation, haptics
and X Reality, there is far less need than two decades prior for people to
commute regularly or frequently for occupational work.
The Internet of Everything, evolved to its optimal potential, together
with automatic maintenance, means that train services run smoothly and
reliably, without technical disruptions. The service is also highly
frequent, with a train arriving at every stop in under three minutes, the
low cost of renewable energy being an important contributing factor.
This is especially so because the train operator generates its own
electricity to run the stations and train services through a combination
of off-grid methods, at virtually no cost beyond hardware maintenance.
As an alternative, with no more drivers on the road, the network of
shared automated vehicles is intelligently coordinated to facilitate
smooth, safe and rapid transportation for commuters. Road congestion is
now studied as a topic in human history, the objective being to promote
appreciation for contemporary technology. It is a concept, though, that
present-day students have genuine difficulty understanding, despite the
simulated demonstrations created for them as part of the curriculum, for
it is an experience they cannot relate to when looking at the world
around them outside of simulation.
Generally, to get around, most ordinary people who do not own cars
use the shared autonomous vehicles on the roads, besides the train
service. The destination a commuter has in mind before leaving home is
picked up by the brainwave sensors all around, allowing their Smart
Assistants to facilitate a coordinated journey. Usually, within a minute
or two of a commuter reaching the roadside point closest to their
building exit, an available autonomous car stops right in front of them.
Once the passenger is safely inside, the vehicle moves in coordination
with the smart road itself and with all the other vehicles on the network,
to traverse the fastest possible route to the passenger's destination.
Like road congestion, road accidents are also now a part of history.
Vehicles coordinated by Artificially Intelligent technology "converse"
with each other constantly at high speed, sharing their speed,
acceleration, deceleration, directions, turns, stoppages and precise
locations, preventing them from ever touching each other. The sensors
on the vehicles also efficiently detect any object or person in their path.
For example, if a child strays onto a road, the nearest vehicle will detect
the child, and all the vehicles on that street will immediately come to a
halt, without crashing into each other, and remain stationary until the
child is safely off the road.
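The street-wide halt can be pictured as a simple broadcast protocol: one vehicle's detection is relayed to every vehicle on the street, and each one brakes until the obstacle clears. The sketch below is hypothetical (the class names and speeds are invented for illustration) and shows only that core idea.

```python
class Vehicle:
    """One autonomous vehicle in the street's constant 'conversation'."""
    def __init__(self, vid, speed=50.0):
        self.vid = vid
        self.speed = speed        # km/h, an arbitrary cruising speed
        self.halted = False

    def receive(self, message):
        # Any obstacle report halts this vehicle; an all-clear resumes it.
        if message == "obstacle":
            self.speed, self.halted = 0.0, True
        elif message == "clear":
            self.speed, self.halted = 50.0, False

class Street:
    """Relays one vehicle's detection to every vehicle on the street."""
    def __init__(self, vehicles):
        self.vehicles = vehicles

    def broadcast(self, message):
        for v in self.vehicles:
            v.receive(message)

street = Street([Vehicle(i) for i in range(3)])
street.broadcast("obstacle")   # the nearest vehicle detects a child
halted = [v.halted for v in street.vehicles]
street.broadcast("clear")      # the child is safely off the road
resumed = [v.speed for v in street.vehicles]
```

The point of the design is that no vehicle decides alone: a single detection stops the whole street at once, which is what prevents the pile-up a chain of independent reactions could cause.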
With the automation of all vehicles, human error has been eliminated
as a cause of accidents. One no longer hears of vehicles going off the
roads, or crashing into pedestrians or buildings. Vehicles that can
actually be driven manually are now in the ultra-luxury range, requiring
special permits and a special licence to drive, and such cars are very
expensive. They are banned from urban areas and population centres
around the world, and can only be driven in designated zones outside of
towns and cities. In smaller countries, such as Singapore, such vehicles
are not allowed onto the roads at all, so even the ultra-rich cannot own
or import them there.
The pavements, for those who prefer travelling on foot, are lined
with travelators. When energy became sustainable and cheap enough,
someone in a city somewhere once decided to ask: why not? A few other
cities quickly followed, and now people no longer know of a pavement
in any city anywhere without a travelator. It is not the fastest way to get
around, but it is comfortable and efficient enough for short distances.
For those who value their time a little more and have no fear of
heights, there is the relatively safe option of air taxis. These are large
autonomous quadcopter drones that take off from the roof gardens of
buildings, whether residential or commercial. Commuters take the lift to
the top floor. The quadcopters are built with vertical rectangular pods in
which commuters sit comfortably and safely. Unlike road or rail travel,
there are no stops of any kind from start point to end point with the
quadcopter air taxis.
For Bhuvan and Shravan, taking the train to school is the more
cost-efficient option compared to the air taxis. Because it cuts through
the underground, the train route is faster than the surface routes via road
or travelator. Taking the train to school on a regular basis is therefore
the best option for the brothers.
Chapter 6: Social Media
Bhuvan and Shravan have fifteen minutes on the train, before they
reach the station next to their school. They spend the time that they have
on the train ride by immersing themselves into the world of social media.
Bhuvan surfs for location experiences and settles on Seville. He
synchronizes his view with that of a random tourist walking through the
city. The tourist streams what he sees through his Smart Glasses. On the
lenses of his own Smart Glasses, while riding a train in Singapore,
Bhuvan sees exactly what the tourist is seeing while walking through
Seville. Visually, the experience is as good as being there in Seville
himself.
Shravan logs into a social networking platform where dreams are
shared. He is indulging in what is known as dream-surfing.
In the past, social network platforms supported the sharing of
thoughts in text, pictures and videos by users, on their respective profiles
or timelines. Now, with Brain Computer Interface enabled applications,
people record their dreams while they sleep. When they are awake, they
watch their own dreams, and also share them if they think that they are
interesting and worth sharing.
Shravan goes through the updates from his connections on the
network. When he finds a post to be of interest, he virtually walks into
the recorded dream that has been posted in a simulated multidimensional
environment. Once he walks into the recorded dream, Shravan
experiences the dream as the dreamer would have experienced it while
asleep the night before. The difference is that, as a dream-surfer, Shravan
has the option to end and walk out of the dream at any time, at will.
Today, Shravan chooses to experience a dream uploaded by someone
in Eastern Europe, about a boat ride down a river. What is interesting
about this dream, which Shravan enjoys, is that it is on a river that does
not actually exist in real life. The river, along with its surroundings well
into the horizons, is entirely imagined by the mind of the uploader, who
has dreamt the dream.
Experiencing the dream of the boat ride down the river, Shravan
relishes the sight of birds resembling real species but of colours not
ordinarily found in reality. Along the banks on both sides are
spectacular landscapes, as if out of Renaissance-era paintings. It amazes
Shravan what the human mind can create as a dream while a person
sleeps.
The platform on which Shravan dream-surfs and the platform
through which Bhuvan location-surfs are both intelligent enough to
review the age appropriateness of what is streamed, instantly and
autonomously determining whether the content is suitable for sharing
and, if so, its appropriateness for different age groups. If, for example,
Bhuvan were to attempt to synchronize his view with another platform
user who is in a location, or viewing something, not suitable for a
younger user, his view would be redirected or his ability to synchronize
with that user immediately blocked. Recorded dreams not deemed
suitable for younger users of the platform will not turn up as options for
Shravan to dream-surf.
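The gate at work here can be sketched as a simple filter. The rating bands below are invented for illustration (none of these names come from the story); in the scenario, a real platform would derive a rating from real-time AI analysis of each stream rather than a fixed label.

```python
# Hypothetical rating bands mapping a label to a minimum viewer age.
MINIMUM_AGE = {"general": 0, "teen": 13, "mature": 18}

def may_view(viewer_age, rating):
    """True if content with this rating is suitable for the viewer."""
    return viewer_age >= MINIMUM_AGE[rating]

def surfable(items, viewer_age):
    """Filter streams or recorded dreams down to age-appropriate ones."""
    return [item for item in items if may_view(viewer_age, item["rating"])]

dreams = [
    {"title": "boat ride", "rating": "general"},
    {"title": "night city", "rating": "mature"},
]
visible = surfable(dreams, viewer_age=12)
```

Applied to a twelve-year-old viewer, only the "general" dream survives the filter; the "mature" one never appears as an option, which mirrors how unsuitable dreams simply do not turn up for a younger dream-surfer.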
Social Media platforms of the present age, thanks to their Artificial
Intelligence enabled parameters, smartly analyse in real time the
location, source and visual implications of content as it is uploaded or
streamed, to intelligently determine its nature, and therefore its
suitability for various audiences, ensuring ethical and safe usage. And
this happens instantaneously, without the need for a pair of human eyes
to scrutinise the massive amount of content shared or streamed every
second.
Chapter 7: Creative Pursuit
Alice is done with lunch, after her morning off from work, and she is
now ready to get back to work.
Alice is an animator. She produces short cartoon movies for children.
She also produces animated edutainment content. It is the perfect job for
her, given a combination of reasons. She loves children. She is a child at
heart and possesses childlike imagination. She is highly creative. And
she is good with storytelling.
Alice produces animation directly from her mind. Thanks to the
Brain Computer Interface technology that is contemporary to her time,
she is able to complete on her own within a single day, what would have
taken a large and diversely talented team weeks or months to accomplish
in a distant past.
Alice produces her content using a variety of techniques. One
method is to use recorded dreams, as she is doing today.
Alice’s AI assistant records her dreams while she sleeps. An
autonomous feature of the dream recording system sharpens and adjusts
the visual imagery to render it visually comprehensible and logical.
Alice watches her recorded dreams and, if she finds them interesting,
instructs an artificially intelligent application to automatically convert
them into animated form. She can then add in sounds of her choice,
including for dialogue. She adds the sounds directly from her
imagination, including representations of different voices for different
characters, or she can draw from available stock sound content, with
which the application also assists her. The application automatically
curates the relevant sound files to make suggestions, and it helps her
match the sounds to the appropriate timings of the moving animated
visuals. She is also able to make changes to the animated content created
from her recorded dreams, to create interesting stories and logical flows.
Oftentimes, Alice just starts by watching her dreams. If she finds one
interesting, she converts it into an animation. However, once a dream is
converted, she may completely change the sequence of events to form a
story that no longer resembles the initial dream she started out with.
Her dreams form the inspiration for the stories or the characters or
contexts that she eventually ends up with by the end of the work process,
even if the final products no longer resemble the original dreams in any
other way.
Another way by which Alice creates her animated works, is by
deliberately and consciously visualising moving animation in her
imagination, which she is also able to record in real time. She can then
mentally edit and tweak the images, sounds and stories, after she has
recorded the imagined moving visuals.
All of Alice’s editing is done through a Brain Computer Interface
enabled Head-Mounted Display (HMD) set. The visual images her mind
imagines are screened in a virtual environment that Alice can experience
immersively through the HMD. Where she needs to
make changes or adjustments, she pauses the screening, reimagines the
part, and records it over. When that is done, Alice can choose different
objects and elements in her animated product, that her audiences can
interact with in different ways, when they view it in an immersive
environment themselves. It means therefore, that the children who watch
her cartoon movies, can interact with the characters, to experience being
a part of the stories.
All of the work that Alice does, from start to finish, to complete an
animated product, can be done even by a child who constitutes her
primary target audience, without any training or orientation. There is no
technical skill involved or required. In other words, anybody can animate
using the Brain Computer Interface enabled application that Alice uses.
That includes the creation of moving images and visuals, along with the
addition of sounds and editing for a logical flow in the storyline.
The artistes of Alice's age no longer need the ability to draw, sketch
or paint with their fingers or hands. Nor do they need the skills to work
with toolbars, buttons and icons on computing devices or interfaces.
They also do not have to use cursors to drag or move items, or the
mouse, an artefact her parents and grandparents relied upon, along with
the keyboard. In today's world, they simply imagine,
mentally instruct, and the programmed applications are able to instantly
execute their instructions automatically. When users think of making
changes, the technology executes the intended changes automatically,
receiving instructions directly through the users' brain signals.
Alice's competency, which she is able to market commercially and
for which others are willing to pay, is her creative imagination and her
ability to tell stories that appeal to her specific target audiences.
Children love her stories and how she creates her characters. Parents
find her stories educational, as do educators. And her creative
imagination stands apart from the rest. Entertainment platform owners
and educational bodies are therefore willing to pay Alice for her content.
For today's workday, Alice starts off by watching her dream from
the night before. The recording shows that her dream started with her
walking through a park with a puppy. She finds it to her liking and
converts what she sees into animated form. Then she inserts a digital
twin of herself into the animation, because the image of herself was not
captured in her dream. Her idea is to turn it into an educational journey
through the park, in which she talks to her puppy about the different
types of birds and flowers they come across.
Chapter 8: Medicine
Omar is sitting in an observation ward. He was out at a mall with his
teenage great-grandson, window shopping, when his watch beeped,
alerting him to an irregularity in his heartbeat.
The irregularity in Omar’s heartbeat had been picked up by the
sensors in his shirt, which relayed the information to the watch that he
was wearing, triggering the beep to warn him.
Verbal instructions were conveyed via Omar’s smart glasses, directly
to his skull, using Bone Conduction technology, telling him to stop
where he was.
An autonomous wheelchair had then been deployed automatically
within the mall. When the wheelchair reached him, Omar sat in it. The
wheelchair then autonomously brought him out of the mall, through the
nearest exit, and towards the main road. There, Omar had been picked up
by a roving autonomous ambulance.
As the ambulance moved along the fastest route to the hospital,
unhindered as the other autonomous vehicles on the road moved out of
its path, Omar’s vitals had been scanned, assessed and relayed to the
hospital in real time.
Upon arrival, Omar was taken to the observation ward. There the
doctor decided to monitor his heart activity. If necessary, Omar will be
kept there overnight, for continued monitoring.
While Omar's heartbeat remains irregular, the irregularity that
triggered the alarm directing him to the hospital in the first place, he
does not seem to be in any significant danger for the time being that
would warrant invasive intervention.
The field of medicine today is far more about anticipation,
prevention, reconstruction and reversal, than it is about cure.
Clothing and accessories worn by people contribute to immediate
identification of potential health issues or deterioration.
Socks, watches, glasses, footwear and all forms of clothing
constantly scan for and identify irregularities in the body. This means
that parameters like body temperature, heart rate, blood pressure and
respiratory rate are constantly monitored, tracked and evaluated. The
body is also constantly scanned for possible internal inflammation,
structural damage or injuries. This happens autonomously and
independently. When there is significant potential cause for concern, the
wearer is alerted, the hospital is notified and ambulatory services are
activated, all at once, instantly.
Beyond the basic parameters, sensors in the things people wear also
constantly monitor and evaluate parameters that were once only
measurable by invasive means, such as the extraction of blood samples.
In the present day, interstitial fluids in skin tissues are non-invasively
scanned to track glucose levels. As with non-invasive glucose
monitoring, other parameters that once required invasive means of
assessment, including sodium levels and blood oxygen levels, are also
tracked through different non-invasive means enabled by contemporary
technology.
According to what is analysed through the non-invasive evaluations
conducted by wearable devices and smart clothing, the individual is
advised by his or her personal devices on necessary dietary tweaks for
the rest of the day, exercise measures, or other lifestyle adjustments, as
means of mitigating threats such as diabetes, blood pressure problems or
stroke risk, to avoid the need for a physician's intervention as far as
possible. The principle the technology now adheres to, in facilitating
such constant non-invasive monitoring along with lifestyle-tweaking
guidance, is that prevention is better than cure. Health issues like
diabetes, or causes of problems such as stroke, are thus avoided before
they become a problem at all, and this contributes significantly to
general population health and longevity.
With most medical issues, threats or even possible risks detected and
diagnosed early, given the built-in functions to do just that in the things
people commonly wear, coupled with nano-medicine practices
employing nanobot swarms for early intervention, cell-level treatment or
therapy, reversal and reconstruction, the human population now enjoys
not just enhanced longevity, but a relatively healthy and comfortable
ageing process compared to what people were able to enjoy through the
previous chapters in the story of the human species.
Disease prevention and the retention of youth go hand in hand.
Generally, as people grow older, they turn to cell and tissue regeneration
and reconstruction treatments to prevent the deterioration of their bodies
or bodily functions. With such treatments, which are relatively painless,
efficient and easy, with nanobot swarms injected into the patient's body
for the task and then left to work while the patient lives his or her
normal life, people now retain their youth and youthfulness for much
longer than people of previous generations did.
Where intravenous interventions are necessary, swarms of
microscopic nanobots are injected into the patient’s body, as far as
possible, for the purposes of automated precision surgery, therapy or
reconstruction, from within.
For example, nanobot swarms are injected into cancer patients at
early stages. The swarms then safely and autonomously seek out the
cancer cells in the patient's body with precision, and destroy them, to
prevent growth or further spread.
Besides cancer cells, nanobot swarms are programmed to detect,
identify and eliminate viruses, including newly discovered ones, within
patients' bodies, when and where a patient is found to be infected. The
use of nanobot swarms for intravenous treatment of viral infections has
greatly aided in mitigating pandemic effects and has enabled the medical
field to cure patients of viral diseases against which there was once little
hope of cure or survival.
Thyroid issues are addressed by nanobot swarms programmed to
reconstruct the thyroid, while at the same time stabilizing the hormone
levels in the patients affected by either an over-active or an under-active
thyroid.
Nanobot swarms are used to break down cholesterol build-ups inside
blood streams. They are also used to conduct constant internal therapy at
stem cell level, to reverse diabetes in patients, by re-enabling their bodies
to produce insulin naturally, without external intake.
Generally, invasive treatments without nanobot swarms are mostly
utilized only in life-and-death emergencies or in cases of injuries where
there are open wounds or bone damage. In such cases, whether for
treating open wounds or conducting surgery, robots are used, for
precision and the avoidance of human error, under the watchful eyes of
surgeons sitting in the next room.
Surgeons of today "get inside" the body of the patient through
360-degree holographic simulation. This is done through a combination
of precision scanning and 360-degree nano-sized camera-bots deployed
within the patient's body, in the area that requires surgical intervention.
With the 360-degree holographic simulations, based on the 360-
degree camera inputs from inside each patient’s body, surgeons immerse
themselves in the visuals of target areas for surgery, to direct the surgical
robots in executing the necessary steps in the procedures.
The robots deliver a precision advantage in surgery over human
hands, because the doctors do not need hand-held controls to direct,
guide or operate them. The surgical robots are synchronized with what
the doctors see and think, thanks to a combination of Brain Computer
Interface and eye-tracking technologies. The gaze of the doctors,
looking at simulated projections of the areas to be operated upon, is
tracked with precision by the Artificially Intelligent software that
intermediates between the doctor and the surgical robots, to facilitate
the robotic surgery. So too are their mental decisions pertaining to the
surgical steps that need to be taken.
As for Omar, his heartbeat seems to be returning to normal on its
own. The doctor is withholding his opinion for the time being, to
monitor the heartbeat a little while longer. It seems, however, that Omar
simply exerted himself a little more than he should have, causing an
irregularity that appears to be a minor short-term reaction, and which
does not warrant any new or further treatment beyond what he has
already received for his heart issues in the past. Quite possibly, the
doctor will send Omar home with advice to pace himself better while
walking outside of home in the future, and not to push himself too hard.
Chapter 9: eSports
Chin Ling is wheelchair-bound. She is dependent upon carers for her
daily physical routines, as she has no control over her limbs; in fact, she
has no control over any bodily function below her neck. Her physical
limitations, however, have not stopped her from becoming a rising star in
the world of eSurfing.
Despite being born and raised in Singapore, which has no beaches
suitable for surfing, and never having been able to physically train for
any sport, including swimming, Chin Ling is a mainstay competitor on
the global eSurfing circuit. She is rising fast in the rankings, and hopes
to win a world championship in the near future.
Chin Ling competes as an eSurfer in a simulated, life-like virtual
world, wherein real-world surf waves from around the globe are
“brought into” simulated realities and recreated in real time. She
interfaces with the virtual world through Brain-Computer Interfacing
technology.
In the virtual world, thanks to neuro-haptics technology, Chin Ling
experiences physical sensations that unfortunately have never been
possible for her in the physical world.
Through eSurfing, Chin Ling experiences exactly what an offline,
physical-world surfer would while manoeuvring on a surfboard and
riding a real wave. She experiences the physical movements and efforts.
She experiences the feeling of the wind against her simulated “body”
through her life-like avatar. She experiences the touch and the
temperature of the water. She experiences the pressure of the waves, and
the tumbles when she falls. She experiences the splashing when she goes
down into the water.
The difference between real-world surfing and eSurfing lies in the
degree of safety. In eSurfing, Chin Ling cannot actually drown in the
event of a mishap. She cannot actually get physically hurt.
Unlike real-world surfing, eSurfing is not about actual physical skill
or prowess. It is about the skill and technique demonstrated with the
power of the mind.
While people in today’s world still participate in or watch physical
sports out of appreciation for physical effort and prowess, the potential
of someone capable in mind is no longer limited by the limits of the
physical body.
X Reality, resulting from the fusion of Virtual Reality, Augmented
Reality, Haptics, 360-degree cameras, non-invasive Brain-Computer
Interfaces, Holographic Simulation, Artificial Intelligence, the Tactile
Internet and much else, enables and empowers people to achieve with
their mental prowess what their physical limits might otherwise bar them
from achieving. eSports today are avenues for just that.
Those bestowed by nature with physical prowess can still excel by
partaking in an actual physical sport that technology does not mediate.
eSport, however, is a domain entirely for the mind, wherein one excels
by out-thinking or out-imagining another at a physical move made
through a virtual avatar, whether or not that person is capable of the
same movement in actual physical reality.
Just as surfing has its virtual equivalent in eSurfing, so does virtually
every other physical sport. In the present day, for example, eSoccer is as
competitive as the actual physical game, while access to avenues to
compete differs, thanks to the role of technological mediation. In
eSoccer, players from around the world participate in global
competitions by forming borderless teams, through Artificial
Intelligence-driven drafts for team formation and selection.
To be drafted into eSoccer teams, aspirants play in virtual trials,
alongside and against simulated players controlled by Artificially
Intelligent software, which automatically assesses their individual
prowess and ranks them in their respective player positions. Based on
those rankings, the software automatically selects and drafts the players
to form the teams. Teams for all other types of team eSports, such as
eHockey, eBasketball and eNetball, are formed in a similar way.
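At its core, a draft of this kind is a ranking-and-assignment routine: rank the aspirants within each position by their assessed score, then deal them out across the teams. The sketch below is a minimal illustration of that idea only; the players, positions, scores and round-robin dealing are invented for the example, not taken from any actual eSports platform.

```python
from collections import defaultdict

def draft_teams(assessments, num_teams):
    """Greedily form teams from AI-assessed trial scores.

    assessments: list of (player, position, score) tuples, where the
    score stands in for the software's automatic prowess assessment.
    Returns a dict mapping team index -> list of (player, position).
    """
    # Rank aspirants within each position, best score first.
    by_position = defaultdict(list)
    for player, position, score in assessments:
        by_position[position].append((score, player))
    for ranked in by_position.values():
        ranked.sort(reverse=True)

    # Deal the ranked players round-robin into teams, so every team
    # receives a comparable spread of positions and ability.
    teams = defaultdict(list)
    for position, ranked in sorted(by_position.items()):
        for i, (score, player) in enumerate(ranked):
            teams[i % num_teams].append((player, position))
    return dict(teams)
```

A real draft would weigh far more than one score per player, but the round-robin step shows how "borderless" teams can be balanced automatically rather than picked by human captains.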
While eSports in no way substitute for the physical exercise needed
for good health and longevity, Chin Ling in particular is grateful for the
technological enablement to compete in a sport, without which she
would never have had the personal satisfaction, sense of achievement
and glory that it has afforded her.
Chapter 10: Cooling the World
Anil works in the Arctic and Antarctica simultaneously. He has
never physically been to either place. He has, however, explored both
extensively and remotely through surrogate machines, as if he had been
there in person.
With present-day Brain-Computer Interface technology, Anil
experiences a form of omnipresence: he is simultaneously present in
multiple remote locations, far from where he physically is, through the
surrogate humanoid machines to which he connects his mind.
Anil works in an industry that is vital to the future of humanity. He
plays a small but critical role in maintaining climate conditions and
preventing climate change, by maintaining the massive ice-makers that
encircle both polar regions.
The ice-makers are powered by multiple sources of sustainable
energy, including solar, wind and hydroelectricity. Channelling sea
water from the ocean, freezing it, then finely crushing it before spraying
it out, they constantly create fresh ice-sheets on the poles to keep polar
temperatures down, running non-stop around the clock and throughout
the year. To keep them running as such, they must be deftly and skilfully
maintained even while in full operation. Such maintenance work is
highly delicate and dangerous.
Anil is a highly skilled engineer, able to flawlessly execute
maintenance tasks on the ice-makers to keep them running smoothly.
The surrogate humanoid machines through which he works remotely,
from his home in Singapore, not only keep him safe from the physical
dangers of the job and the cold of the locations, but also enable him to
be in multiple places at a time, for efficiency.
Anil’s work entails assessing the running condition of each ice-
maker, conducting detailed inspections up close, and identifying
necessary maintenance actions. He then either performs the maintenance
tasks himself where absolutely required, or directs autonomous machines
to execute them, like a part replacement for example, where the steps are
routine and pre-programmable.
Given the critical nature of the maintenance, Anil’s human
supervision is warranted, in case autonomous machines cannot
adequately address the issues that surface in a timely manner.
Running the ice-makers at both polar ends of the planet is one of
several measures that the human race now collectively employs to
maintain climate conditions.
Every building meant for housing or any other human activity is now
fitted with carbon absorption machines that automatically filter the air
and absorb excess carbon dioxide from the atmosphere.
Large swarms of aerial robots are now deployed to autonomously
plant trees, both to reforest previously destroyed forests and to afforest
what were previously desert spaces, creating and maintaining natural
carbon sinks.
The tree-planting for reforestation and afforestation is a massive,
systematic, factory-like process. The aerial robot swarms are guided by
geo-location systems that chart their trajectories in rows and columns,
for optimal use of space for tree planting and growth.
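That row-and-column charting amounts to generating a planting grid for the swarm to fly. A minimal sketch of the idea follows; the coordinates, units and spacing are invented for illustration, not drawn from any real guidance system.

```python
def planting_grid(origin, rows, cols, spacing_m):
    """Generate row/column waypoints for an aerial planting swarm.

    origin: (x, y) of the first planting spot on a local grid, in
    metres; spacing_m: distance between adjacent trees. Both are
    illustrative stand-ins for real geo-location guidance data.
    """
    x0, y0 = origin
    waypoints = []
    for r in range(rows):
        # Alternate the sweep direction each row, so the swarm traces
        # a back-and-forth path with no wasted travel between rows.
        order = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in order:
            waypoints.append((x0 + c * spacing_m, y0 + r * spacing_m))
    return waypoints
```

For example, `planting_grid((0, 0), 2, 3, 5)` yields two rows of three planting spots, five metres apart, with the second row swept in reverse.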
Autonomous police-bots now patrol designated forest areas, both old
and new, around the world, guarding against illegal logging,
deforestation and attempts to start fires. The police-bots are fitted with
tasers, which they autonomously use to incapacitate persons who enter
or approach the forests without authorisation, before apprehending them
and transporting them to detention facilities.
Autonomous firefighting aerial drones now constantly patrol the
skies over forests to spot and put out fires.
To ensure sufficient water distribution to support and sustain forest
growth in reforestation and afforestation efforts, autonomous unmanned
aerial vehicles execute cloud seeding above strategic locations to
stimulate rainfall. Autonomous burrowing machines drill into river beds
and tunnel beneath the intended forest floors, channelling water under
those areas for optimal underground irrigation.
The various robots deployed to develop, maintain and protect the
forests are powered by multiple means, depending on the conditions
around them. Where the weather allows, they run on solar energy;
otherwise, they are charged by aerial charging drones that store and
relay sustainable energy from nearby geothermal energy stations.
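The power logic described above reduces to a simple decision rule: use solar when the weather allows, summon a charging drone only when stored charge runs low, and otherwise run on battery. The thresholds and units below are invented for illustration only:

```python
def choose_power_source(solar_irradiance_wm2, battery_level,
                        solar_min=200.0, battery_low=0.3):
    """Pick a forest robot's power strategy for its next work cycle.

    solar_irradiance_wm2: current sunlight in W/m^2; battery_level:
    stored charge as a 0-1 fraction. Both thresholds are hypothetical
    placeholders, not real specifications.
    """
    if solar_irradiance_wm2 >= solar_min:
        return "solar"            # weather allows direct solar power
    if battery_level < battery_low:
        return "charging drone"   # summon relayed geothermal energy
    return "battery"              # run on stored charge for now
```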
Beneath the ocean’s surface, nanobot swarms are delivered by
autonomous submersibles to coral reefs, to treat past damage. The
nanobot swarms rehabilitate the corals from both the outside and the
inside, through cell re-engineering.
The autonomous submersibles that relay and deploy the nanobot
swarms serve multiple other purposes. They also deliver and plant newly
bioengineered coral larvae, with the aim of expanding the reefs.
The autonomous submersibles also cool the ocean water over the
coral reefs. They draw in the water around them, cool it to temperatures
just above freezing, and then release it over the reefs, in a continuous,
automatic process that also enables the submersibles to generate
hydrothermal energy to power themselves. As they draw in the water to
cool it, they also filter it to remove pollutants.
While temperature spikes harm the corals, the conservation and
expansion of coral reefs facilitates the wider conservation of a marine
ecosystem sustaining diverse life forms, including the plankton and
algae that help remove carbon dioxide from the atmosphere. This, in
turn, contributes to the cooling of surface temperatures.
Anil draws satisfaction from his job out of his love for hands-on
engineering. That his work contributes towards preserving the climatic
conditions necessary for long-term human survival is an added reward,
because he cherishes the prospect of continued human life for the future
descendants of his children, Shravan and Bhuvan.
Anil looks up through one of his surrogate machines in Antarctica
and sees autonomous drones flying above, headed off to brighten the
clouds so that they reflect the sun’s radiation back into outer space.
Reflecting radiation back as such prevents heating at the poles, which
would otherwise offset humanity’s cooling efforts through the ice-
makers that Anil maintains.