This document provides a roadmap for achieving whole brain emulation (WBE), which is the modeling of the human brain accurately enough to mimic its behavior. It outlines the key technologies needed, including high-resolution brain scanning, neural simulation, and computer hardware advances. It acknowledges that WBE faces major challenges from our limited understanding of the brain and uncertainties around what level of detail is required. The roadmap is intended to stimulate discussion and further research toward the goal of WBE, while also identifying areas that require more experimentation to reduce uncertainties.
Fabrication methods - Nanoscience and nanotechnologies - NANOYOU
An introduction to fabrication methods.
This chapter is part of the NANOYOU training kit for teachers.
For more resources on nanotechnologies visit: www.nanoyou.eu
Overview of nanomaterials - Nanoscience and nanotechnologies - NANOYOU
An introduction to nanomaterials.
This chapter is part of the NANOYOU training kit for teachers.
For more resources on nanotechnologies visit: www.nanoyou.eu
Fundamental "Nano-effects" - Nanoscience and nanotechnologies - NANOYOU
An introduction to the fundamental Nano-effects.
This chapter is part of the NANOYOU training kit for teachers.
For more resources on nanotechnologies visit: www.nanoyou.eu
Application of nanotechnologies: Medicine and healthcare - NANOYOU
An introduction to the applications of nanotechnologies in medicine.
This chapter is part of the NANOYOU training kit for teachers.
For more resources on nanotechnologies visit: www.nanoyou.eu
Characterization methods - Nanoscience and nanotechnologies - NANOYOU
An introduction to characterization methods.
This chapter is part of the NANOYOU training kit for teachers.
For more resources on nanotechnologies visit: www.nanoyou.eu
Examples of nanoscience that can be found in nature.
This chapter is part of the NANOYOU training kit for teachers.
For more resources on nanotechnologies visit: www.nanoyou.eu
An introduction to the application of nanotechnologies within information and communication technologies.
This chapter is part of the NANOYOU training kit for teachers.
For more resources on nanotechnologies visit: www.nanoyou.eu
In technical reality, informatics is coming nearer and nearer to the “biological” human being. It seems necessary to protect humans against excessive demands from the hardware of the cyber world, and especially against computers as machines and their software.
The human being as an evolutionary species is measured on a very long development scale of thousands of years; IT, on the other hand, works with human creativity in much faster intervals of a few years or, at most, decades. The execution times of the biological human being and his general surroundings on the one hand, and of IT and its applications, running at electronic speeds, on the other, constitute an outstanding conflict of time. Humans don’t act like computers!
The structure of this work is given by the biological organism, but also by the thinking and feeling of the human being. He has senses, instruments of movement, and mental abilities as a whole. In natural science the human being is a biological object; in human society he is an individual subject with his own self-awareness and personal intelligence. Compared with all living subjects, he has the most highly developed consciousness of his own person. Modern science seems to treat this as unimportant.
The goal of this work is to find the biological and psychological limits of the human being and to protect him, in a preventive-medical way, against coming dangers such as harmful stress and subsequent sickness. Theoretical informatics has to find out principles, rules and ways of thinking for human-oriented IT in order to avoid this danger.
There is no claim to find out everything at once, but a beginning on a scientific level is intended. This work is conceived as the first foundation and compendium for all main themes of HO (Human Orientation).
In addition, a speciality from biology is newly described: the Rules of Mendel, a biological stimulus for informatics.
For spring semester senior design, my team of five designed an artificial gravity habitat to study the effects of cosmic radiation on an aquaponics system while removing the effects of zero gravity.
This presentation is designed to cover some of the principles of Basic Life Support & First Aid for Children as of May 2014. It follows the Australian Resuscitation Guidelines and uses the DRSABCD approach.
D - Danger
R - Response
S - Send for Help
A - Airways
B - Breathing
C - CPR
D - Defib.
It is intended for lay-people and healthcare students.
About the book: This book traces the history of AI from the early dreams of eighteenth-century (and earlier) pioneers to the more successful work of today’s AI engineers. With AI becoming so much a part of everyone’s life, the technology is already embedded in face-recognizing cameras, speech-recognition software, Internet search engines, and health-care robots. Through the book’s many diagrams and easy-to-understand descriptions of AI programs, you gain an understanding of how these and other AI systems work. End-of-chapter notes containing citations to valuable source materials are of great use to AI scholars and researchers, and the book promises to be the definitive history of a field that has captivated the imaginations of scientists, philosophers, and writers for centuries.
About the author: Nils J. Nilsson, Kumagai Professor of Engineering (Emeritus) in the Department of Computer Science at Stanford University, California, received his Ph.D. degree in Electrical Engineering from Stanford in 1958. He spent twenty-three years at the Artificial Intelligence Center of SRI International working on statistical and neural-network approaches to pattern recognition, co-inventing the A* heuristic search algorithm and the STRIPS automatic planning system, directing work on the integrated mobile robot, SHAKEY, and collaborating in the development of the PROSPECTOR expert system. He published five textbooks on artificial intelligence, taught courses on artificial intelligence and machine learning, and researched flexible robots that react to dynamic worlds, plan courses of action, and learn from experience. Professor Nilsson served on the editorial boards of the journal Artificial Intelligence and of the Journal of Artificial Intelligence Research and was an Area Editor for the Journal of the Association for Computing Machinery. He is a past-president and Fellow of the American Association for Artificial Intelligence, a Fellow of the American Association for the Advancement of Science, and recipient of the IEEE Neural-Network Pioneer award, the IJCAI Research Excellence award, and the AAAI Distinguished Service award.
From sound to grammar: theory, representations and a computational model - Marco Piccolino
This thesis contributes to the investigation of the sound-to-grammar mapping by developing a computational model in which complex acoustic patterns can be represented conveniently, and exploited for simulating the prediction of English prefixes by human listeners.
The model is rooted in the principles of rational analysis and Firthian prosodic analysis, and formulated in Bayesian terms. It is based on three core theoretical assumptions: first, that the goals to be achieved and the computations to be performed in speech recognition, as well as the representation and processing mechanisms recruited, crucially depend on the task a listener is facing, and on the environment in which the task occurs. Second, that whatever the task and the environment, the human speech recognition system behaves optimally with respect to them. Third, that internal representations of acoustic patterns are distinct from the linguistic categories associated with them.
The representational level exploits several tools and findings from the fields of machine learning and signal processing, and interprets them in the context of human speech recognition. Because of their suitability for the modelling task at hand, two tools are dealt with in particular: the relevance vector machine (Tipping, 2001), which is capable of simulating the formation of linguistic categories from complex acoustic spaces, and the auditory primal sketch (Todd, 1994), which is capable of extracting the multi-dimensional features of the acoustic signal that are connected to prominence and rhythm, and representing them in an integrated fashion. Model components based on these tools are designed, implemented and evaluated.
The implemented model, which accepts recordings of real speech as input, is compared in a simulation with the qualitative results of an eye-tracking experiment. The comparison provides useful insights about model behaviour, which are discussed.
Throughout the thesis, a clear distinction is drawn between the computational, representational and implementation devices adopted for model specification.
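The Bayesian formulation behind such a model can be illustrated with a minimal sketch. Everything below is invented for illustration (a three-word lexicon with made-up priors and likelihoods); the thesis's actual model operates over real acoustic representations, but the core inference step is the same: combine prior word probabilities with the likelihood of the acoustic evidence to obtain a posterior over competing candidates.

```python
# Minimal Bayesian listener sketch: P(word | acoustics) ∝ P(acoustics | word) * P(word).
# Lexicon, priors, and likelihood values are hypothetical, for illustration only.

def posterior(priors, likelihoods):
    """Normalise prior * likelihood over a set of candidate words."""
    joint = {w: priors[w] * likelihoods[w] for w in priors}
    total = sum(joint.values())
    return {w: p / total for w, p in joint.items()}

# Candidate words competing after hearing an initial stretch like "un-".
priors = {"unable": 0.5, "undo": 0.3, "uniform": 0.2}       # word frequencies
likelihoods = {"unable": 0.2, "undo": 0.7, "uniform": 0.1}  # P(evidence | word)

post = posterior(priors, likelihoods)
best = max(post, key=post.get)
print(best, round(post[best], 3))  # → undo 0.636
```

Note how a lower-frequency word ("undo") can still win once the acoustic evidence favours it strongly enough; this interplay of prior and likelihood is what an optimal-listener account predicts.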
In this doctoral thesis, a classical, lumped-element model is used to study the cochlea and to simulate click-evoked and spontaneous OAEs. The original parameter values describing the microscopic structures of the cochlea are re-tuned to match several key features of the cochlear response in humans. The frequency domain model is also recast in a formulation known as state space; this permits the calculation of linear instabilities given random perturbations in the cochlea which are predicted to produce spontaneous OAEs. The averaged stability results of an ensemble of randomly perturbed models have been published in [(2008) ‘Statistics of instabilities in a state space model of the human cochlea,’ J. Acoust. Soc. Am. 124(2), 1068-1079]. These findings support one of the prevailing theories of SOAE generation.
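The state-space stability argument can be sketched with a toy single-element model (all parameter values are invented; the real cochlear model couples many micromechanical elements). A damped oscillator x'' + 2ζωx' + ω²x = 0 has state matrix A = [[0, 1], [-ω², -2ζω]]; random perturbations that push the damping ratio ζ negative give A an eigenvalue with positive real part, i.e. a linear instability of the kind predicted to produce spontaneous OAEs.

```python
import cmath
import math
import random

def eigen_real_parts(omega, zeta):
    """Real parts of the eigenvalues of A = [[0, 1], [-omega^2, -2*zeta*omega]],
    from the characteristic equation s^2 + 2*zeta*omega*s + omega^2 = 0."""
    disc = cmath.sqrt((zeta * omega) ** 2 - omega ** 2)
    return (-zeta * omega + disc).real, (-zeta * omega - disc).real

def is_unstable(omega, zeta):
    """Linear instability: any eigenvalue with positive real part."""
    return any(r > 0 for r in eigen_real_parts(omega, zeta))

random.seed(0)
omega = 2 * math.pi * 1000.0  # a 1 kHz element (illustrative value)
# Perturb the damping ratio around a small positive nominal value; draws that
# land below zero (net active amplification) yield unstable elements.
trials = [0.05 + random.gauss(0.0, 0.08) for _ in range(1000)]
unstable_fraction = sum(is_unstable(omega, z) for z in trials) / len(trials)
print(f"unstable fraction: {unstable_fraction:.2f}")
```

An ensemble study like the one cited in the abstract does, in effect, this computation over many randomly perturbed full cochlear models and collects the statistics of the resulting instabilities.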
Quantum Variables in Finance and Neuroscience Lecture Slides - Lester Ingber
Background
About 7500 lines of PATHINT C-code, used previously for several systems, have been generalized from 1 dimension to N dimensions, and from classical to quantum systems, into qPATHINT, which processes complex (real + imaginary) variables. qPATHINT was applied to systems in neocortical interactions and financial options. Classical PATHINT has developed a statistical mechanics of neocortical interactions (SMNI), fit by Adaptive Simulated Annealing (ASA) to electroencephalographic (EEG) data under attentional experimental paradigms. Classical PATHINT has also demonstrated development of Eurodollar options in industrial applications.
Objective
A study is required to see if the qPATHINT algorithm can scale sufficiently to further develop real-world calculations in these two systems, requiring interactions between classical and quantum scales. A new algorithm is also needed to develop interactions between classical and quantum scales.
Method
Both systems are developed using mathematical-physics methods of path integrals in quantum spaces. Supercomputer pilot studies using XSEDE.org resources tested various dimensions for their scaling limits. For the neuroscience study, neuron-astrocyte-neuron Ca-ion waves are propagated for hundreds of milliseconds. A derived expectation of the momentum of Ca-ion wave-functions in an external field permits initial direct tests of this approach. For the financial options study, all traded Greeks are calculated for Eurodollar options in quantum-money spaces.
Results
The mathematical-physics and computer parts of the study are successful for both systems. A 3-dimensional path-integral propagation of qPATHINT is within normal computational bounds on supercomputers. The neuroscience quantum path-integral also has a closed-form solution at arbitrary time that tests qPATHINT.
Conclusion
Each of the two systems considered contributes insight into applications of qPATHINT to the other system, leading to new algorithms presenting time-dependent propagation of interacting quantum and classical scales. This can be achieved by propagating qPATHINT and PATHINT in synchronous time for the interacting systems.
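The numerical core of PATHINT-style codes, repeated application of a short-time propagator over a spatial grid, can be sketched in one dimension. The sketch below is a minimal pure-diffusion illustration with invented grid and step sizes; the actual PATHINT/qPATHINT codes handle general drift and diffusion in N dimensions and, for qPATHINT, complex variables.

```python
import math

def short_time_kernel(x_from, x_to, dt, diffusion):
    """Short-time propagator for pure diffusion: a Gaussian in (x_to - x_from)
    with variance 2*D*dt."""
    var = 2.0 * diffusion * dt
    return math.exp(-(x_to - x_from) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def propagate(p, grid, dt, diffusion, steps):
    """Repeatedly apply the kernel: p(x', t+dt) = sum_x K(x'|x) p(x, t) dx."""
    dx = grid[1] - grid[0]
    for _ in range(steps):
        p = [sum(short_time_kernel(x, xp, dt, diffusion) * pi
                 for x, pi in zip(grid, p)) * dx
             for xp in grid]
    return p

# Illustrative numbers only: 1-D grid, delta-like start at x = 0.
n, dx, dt, D = 121, 0.1, 0.01, 1.0
grid = [(i - n // 2) * dx for i in range(n)]
p = [0.0] * n
p[n // 2] = 1.0 / dx                      # discrete approximation of a delta function
p = propagate(p, grid, dt, D, steps=30)

mass = sum(p) * dx                        # probability is (approximately) conserved
variance = sum(x * x * pi for x, pi in zip(grid, p)) * dx
print(round(mass, 3), round(variance, 2))  # variance ≈ 2*D*t = 2*1.0*0.3 = 0.6
```

The check at the end illustrates why path-integral propagation is attractive for stability: total probability is conserved step by step, and for this solvable case the spread matches the known closed-form answer, the same kind of closed-solution test mentioned for qPATHINT above.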
Introduction to Objectual Philosophy or Mathematical Principles of Natural Philosophy. You can read or download the book from http://filosofia.obiectuala.ro/en/
Existential Risk Prevention as Global Priority - Karlos Svoboda
• Existential risk is a concept that can focus long-term global efforts and sustainability concerns.
• The biggest existential risks are anthropogenic and related to potential future technologies.
• A moral case can be made that existential risk reduction is strictly more important than any other global public good.
• Sustainability should be reconceptualised in dynamic terms, as aiming for a sustainable trajectory rather than a sustainable state.
• Some small existential risks can be mitigated today directly (e.g. asteroids) or indirectly (by building resilience and reserves to increase survivability in a range of extreme scenarios), but it is more important to build capacity to improve humanity’s ability to deal with the larger existential risks that will arise later in this century. This will require collective wisdom, technology foresight, and the ability when necessary to mobilise a strong global coordinated response to anticipated existential risks.
• Perhaps the most cost-effective way to reduce existential risks today is to fund analysis of a wide range of existential risks and potential mitigation strategies, with a long-term perspective.
Brain computer interaction and medical access to the brain - Karlos Svoboda
This paper discusses current clinical applications and possible future uses of brain-computer interfaces (BCIs) as a means for communication, motor control and entertainment. After giving a brief account of the various approaches to direct brain-computer interaction, the paper will address individual, social and ethical implications of BCI technology to extract signals from the brain.
These include reflections on medical and psychosocial benefits and risks, user control, informed consent, autonomy and privacy as well as ethical and social issues implicated in putative future developments with focus on human self-understanding and the idea of man. BCI use which involves direct interrelation and mutual interdependence between human brains and technical
devices raises anthropological questions concerning self-perception and the technicalization of the human body.
A rights based model of governance - the case of human enhancement - Karlos Svoboda
The current development of technology and scientific research may give rise to several applications on human beings. In this context, emerging technologies can further foster such applications and pave the way for new and incisive research towards human enhancement (HE). Thanks to emerging technologies, HE can become more effective and represent a concrete challenge for present societies, also in Europe. Scientists at the Northwestern University Feinberg School of Medicine, for instance, recently created a brain-synthesized estrogen that influences synaptic structure, function and cognitive processes by augmenting the networks among neurons (Srivastava et al. 2010). This could thus become a case of future brain-doping.
Ethics of security and surveillance technologies opinion 28 - Karlos Svoboda
In addition, surveillance of the public by companies or
by other individuals should be subject to conditions,
and again, the opinion addresses the principles that
govern these forms of ‘commercial’ or individual
surveillance, and the manner in which the data so
gathered may be used as part of a data mining or profiling system by private entities or the state.
The digital revolution and subsequent advances in
mobile, wireless and networked devices have significantly contributed to the development of security and surveillance technologies. New technologies offer the possibility of recording the everyday activities of billions of individuals across the globe. Our mobile phones can identify and pinpoint our location at any given moment, loyalty cards allow commercial entities to analyse our spending and track our personal preferences, keystroke software monitors our performance and productivity in the workplace and our electronic communications can be screened for key words or phrases by intelligence services. Moreover, personal data concerning our health, employment, travel and electronic communications are stored in databases,
and data mining techniques allow for large amounts
of personal data from these disparate sources to be
organised and analysed, thereby facilitating the discovery of previously unknown relationships within these
data. Security technologies are no longer discrete; the trend is toward convergence, creating more powerful
networked systems. Thus, our everyday lives are scrutinised by many actors as never before, all made possible by developments in technology together with political choices or lack thereof.
Emerging Technoethics of Human Interaction with Communication, Bionic and Rob... - Karlos Svoboda
In this deliverable, the protection and promotion of human rights is explored in connection with various case-studies in robotics, bionics, and AI agent technologies. This is done along various dimensions, prominently including human dignity, autonomy, responsibility, privacy, liberty, fairness, justice, and personal identity.
Ethical case-studies in robotics concern learning robots, unmanned combat air vehicles, robot companions, surgery robots, and a robotic street cleaning system. Case-studies illustrating current developments of the field with imminent potential applications comprise the robotic street cleaning system, surgery robots, and the unmanned air vehicles. Robots making extensive use of learning capabilities and robots acting as companions to human beings represent somewhat more distant possibilities, enabling one to connect in meaningful ways an analysis of short-term ethical issues in robotics with a pro-active interest in long-term ethical issues.
The bionics case-studies considered here concern specific kinds of implants in the human body, involving the human peripheral or central nervous system, and other kinds of non-invasive brain-computer interfaces. These case-studies are closely related to the robotics case-studies, insofar as these bionic technologies enable one to connect to and often control robotic effectors and sensors. Ethical issues examined in connection with these technologies concern both a short-term perspective, mostly arising from their therapeutic uses, and a long-term perspective, mostly arising from the possibility of extending communication, control, cognitive, and perceptual capabilities of both disabled and non-disabled individuals.
This networking of humans with both robotic and computer-based information systems motivates the inclusion of a case-study about AI agent technologies in this report, concerning systems that have been with us for quite a while, that is, adaptive hypermedia systems for
educational applications. These technologies enable one to design and implement software agents that are similar to robotic agents, also from an ethical standpoint, insofar as they are capable of, e.g., autonomous action, reasoning, perception, and planning.
Ethical issues examined in this report will be amplified by the convergence of softbot and robotic technologies directly interacting with human beings and other biological systems by means of bionic interfaces. This long-term perspective shows that the case-studies examined here - which are significant in their own right from the isolated perspectives of robotics, bionics, and AI - can soon become parts of broader ethical problems that we will have to address and cope with in the near future.
Some futurists and artificial intelligence experts envision credible scenarios in which synthetic brains will, within this century, extend the functionality of our own brains to the point where they will rival and then surpass the power of an organic human brain. At the same time, humans seem to have no limitations when it comes to finding ways to attack the computerized devices that others have invented. Attackers have successfully compromised computers, mobile phones, ATMs, telephone networks, and even networked power grids. If neural devices fulfill the promise of treatment, and enhance our quality of lives and functionality—which appears likely, given the preliminary clinical success demonstrated from neuroprosthetics—their use and adoption will likely grow in the future. When this happens, inevitably, a wide variety of legal, security, and public policy concerns will follow. We will begin this article with an overview of brain implants and neural devices and their likely uses in the future. We will then discuss the legal issues that will arise from the intersection among neural devices, information security, cybercrime, and the law.
Nanotechnology, ubiquitous computing and the internet of things - Karlos Svoboda
The aim of this report is to provide a review of current developments in nanotechnology, ubiquitous computing and what is increasingly being referred to as “domotics” – the integration of domestic architectures (domus) with information systems and devices (informatics). The report will also provide a preliminary analysis of the potential impacts of these developments on the right to privacy and to data protection.
These areas of technological development represent the convergence of two domains of current research – nanoscience and distributed computing. Much of the existing literature suggests that advances in nanotechnology are likely to operate as an underlying suite of techniques that will enable the development of miniaturised and distributed information systems and the integration of informatics devices into a range of everyday consumer goods and household architectures. As we outline below, the convergence of nanotechnology and research in ubiquitous and distributed systems is likely to result in the development of a range of new sensor technologies and advances in surveillance and monitoring techniques, deployed in civilian, military and security contexts. For these reasons, advances in nanotechnology and ubiquitous computing are likely to intensify existing concerns associated with data collection and the right to privacy.
In order to provide some background to our review of these issues, in this section of the report we outline definitions of the field and current trends in surveillance, data-mining and monitoring.
Identity REvolution multi disciplinary perspectives - Karlos Svoboda
The identity [r]evolution is happening. Who are you, who am I in the information society?
In recent years, the convergence of several factors – technological, political, economic – has accelerated a fundamental change in our networked world. On a technological level, information becomes easier to gather, to store, to exchange and to process. The belief that more information brings more security has been a strong political driver to promote information gathering since September 11. Profiling intends to transform information into knowledge in order to anticipate one’s behaviour, or needs, or preferences. It can lead to categorizations according to some specific risk criteria, for example, or to direct and personalized marketing. As a consequence, new forms of identities appear. They are not necessarily related to our names anymore. They are based on information, on traces that we leave when we act or interact, when we go somewhere or just stay in one place, or even sometimes when we make a choice. They are related to the SIM cards of our mobile phones, to our credit card numbers, to the pseudonyms that we use on the Internet, to our email addresses, to the IP addresses of our computers, to our profiles… Like traditional identities, these new forms of identities can allow us to distinguish an individual within a group of people, or describe this person as belonging to a community or a category.
Intimate technology - the battle for our body and behaviour - Karlos Svoboda
This essay aims to spark a wave of public and political debate about a series of new products that is already being showered over you, the volume of which will continue to increase during the coming years. This essay takes a serious look at the trend that technology is rapidly nesting itself in between us, very close to us and even within us, increasingly coming to know us and even receiving human traits. In short, we have become human-machine mixtures, cyborgs.
For ages, humans have developed cures for diseases and devised techniques which make the hardships of life more endurable. All these were believed to make human life more humane, i.e. to help humans to live out their inherent (natural, God-given) potentiality to a fuller extent. Recent technology, known as human enhancement, challenges this 'natural' normativity: going beyond restoring wellbeing and optimizing human potentiality, enhancement also develops capacities which can, in a sense, be called new. Chemicals have become available that increase physical performance in, for example, the field of sports. Other chemicals enhance psychological endurance, mood, and cognition. Work is in progress on developing functional implants within the body, such as computer chips integrated in the brain, with the aim of enhancing performance beyond what humans are naturally capable of. Changes are being made to body cells and systems, and techniques are being discussed to change human genes. Finally, techniques are being developed, and in part already applied, which extend the human life-span. Human enhancement is about trying to make changes to minds and bodies – to characteristics, abilities, emotions and capacities – beyond what we currently regard as normal.
Making perfect life european governance challenges in 21st Century Bio-engine... - Karlos Svoboda
The STOA project ‘Making Perfect Life’ looked into four fields of 21st century bio-engineering: engineering of living artefacts, engineering of the body, engineering of the brain, and engineering of intelligent artefacts. This report describes the main results of the project.
The report shows how developments in the four fields of bio-engineering are shaped by two megatrends: “biology becoming technology” and “technology becoming biology”. These developments result in a broadening of the bio-engineering debate in our society. The report addresses the long-term views that are inspiring this debate and discusses a multitude of ethical, legal and social issues that arise from bio-engineering developments in the fields described. Against this background, four specific developments are studied in more detail: the rise of human genome sequencing, the market introduction of neurodevices, the capturing by information technology of the psychological and physiological states of users, and the pursuit of standardisation in synthetic biology. These developments are taken in this report as a starting point for an analysis of some of the main European governance challenges in 21st century bio-engineering.
GRAY MATTERS Integrative Approaches for Neuroscience, Ethics, and Society - Karlos Svoboda
The Presidential Commission for the Study of Bioethical Issues (Bioethics Commission) is an advisory panel of the nation’s leaders in medicine, science, ethics, religion, law, and engineering. The Bioethics Commission advises the President on bioethical issues arising from advances in biomedicine and related areas of science and technology. The Bioethics Commission seeks to identify and promote policies and practices that ensure scientific research, health care delivery, and technological innovation are conducted in a socially and ethically responsible manner. For more information about the Bioethics Commission, please see http://www. bioethics.gov.
Brain emulation-roadmap-report
Whole Brain Emulation
A Roadmap
(2008) Technical Report #2008‐3
Anders Sandberg*
Nick Bostrom
Future of Humanity Institute
Faculty of Philosophy & James Martin 21st Century School
Oxford University
CITE: Sandberg, A. & Bostrom, N. (2008): Whole Brain Emulation: A Roadmap, Technical Report #2008‐3, Future of
Humanity Institute, Oxford University
URL: www.fhi.ox.ac.uk/reports/2008‐3.pdf
(*) Corresponding author: anders.sandberg@philosophy.ox.ac.uk
In memoriam: Bruce H. McCormick (1930–2007)
Contents

Whole Brain Emulation
A Roadmap
In memoriam: Bruce H. McCormick (1930–2007)
Contents
Introduction
  Thanks to
The concept of brain emulation
  Emulation and simulation
  Little need for whole-system understanding
  Levels of emulation and success criteria
  Scale separation
  Simulation scales
  WBE assumptions
Roadmap
  Requirements
  Linkages
  Roadmap
  Technology drivers
  Uncertainties and alternatives
  Alternative pathways
  Related technologies and spin-offs
Issues
  Emulation systems
  Complications and exotica
  Summary
Scanning
  Embedding, fixation and staining techniques
  Conclusion
Image processing and scan interpretation
  Geometric adjustment
  Noise removal
  Data interpolation
  Cell tracing
  Synapse identification
  Identification of cell types
  Estimation of parameters for emulation
  Connectivity identification
  Conclusion
Neural simulation
  How much neuron detail is needed?
  Neural models
  Simulators
  Parallel simulation
  Current large-scale simulations
  Conclusion
Body simulation
  Conclusion
Environment simulation
  Vision
  Hearing
  Smell and Taste
  Haptics
  Conclusion
Computer requirements
  Conclusions
Validation
Discussion
Appendix A: Estimates of the computational capacity/demands of the human brain
Appendix B: Computer Performance Development
  Processing Power
  Memory
  Disc drives
  Future
Appendix C: Large-scale neural network simulations
Appendix D: History and previous work
Appendix E: Non-destructive and gradual replacement
  Non-Destructive Scanning
  Gradual replacement
Appendix F: Glossary
References
Introduction
Whole brain emulation (WBE), the possible future one‐to‐one modelling of the function of the
human brain, is academically interesting and important for several reasons:
• Research
o Brain emulation is the logical endpoint of computational neuroscience’s
attempts to accurately model neurons and brain systems.
o Brain emulation would help us to understand the brain, both in the lead‐up
to successful emulation and afterwards by providing an ideal test bed for
neuroscientific experimentation and study.
o Neuromorphic engineering based on partial results would be useful in a
number of applications such as pattern recognition, AI and brain‐computer
interfaces.
o As a long‐term research goal it might be a strong vision to stimulate
computational neuroscience.
o As an exercise in future studies, it shows how a radical future
possibility can be examined in the light of current knowledge.
• Economics
o The economic impact of copyable brains could be immense, and could have
profound societal consequences (Hanson, 1994, 2008b). Even low probability
events of such magnitude merit investigation.
• Individually
o If emulation of particular brains is possible and affordable, and if concerns
about individual identity can be met, such emulation would enable back‐up
copies and “digital immortality”.
• Philosophy
o Brain emulation would itself be a test of many ideas in the philosophy of
mind and philosophy of identity, or provide a novel context for thinking
about such ideas.
o It may represent a radical new form of human enhancement.
WBE represents a formidable engineering and research problem, yet one which appears to
have a well‐defined goal and could, it would seem, be achieved by extrapolations of current
technology. This is unlike many other suggested radically transformative technologies like
artificial intelligence where we do not have any clear metric of how far we are from success.
In order to develop ideas about the feasibility of WBE, ground technology foresight and
stimulate interdisciplinary exchange, the Future of Humanity Institute hosted a workshop on
May 26 and 27, 2007, in Oxford. Invited experts from areas such as computational
neuroscience, brain‐scanning technology, computing, nanotechnology, and neurobiology
presented their findings and discussed the possibilities, problems and milestones that would
have to be reached before WBE becomes feasible.
The workshop avoided dealing with socioeconomic ramifications and with philosophical
issues such as theory of mind, identity or ethics. While important, such discussions would
undoubtedly benefit from a more comprehensive understanding of the brain—and it was this
understanding that we wished to focus on furthering during the workshop. Such issues will
likely be dealt with at future workshops.
This document combines an earlier whitepaper that was circulated among workshop
participants, and additions suggested by those participants before, during and after the
workshop. It aims at providing a preliminary roadmap for WBE, sketching out key
technologies that would need to be developed or refined, and identifying key problems or
uncertainties.
Brain emulation is currently only a theoretical technology. This makes it vulnerable to
speculation, “handwaving” and untestable claims. As proposed by Nick Szabo, “falsifiable
design” is a way of curbing the problems with theoretical technology:
…the designers of a theoretical technology in any but the most predictable of areas
should identify its assumptions and claims that have not already been tested in a
laboratory. They should design not only the technology but also a map of the
uncertainties and edge cases in the design and a series of such experiments and tests
that would progressively reduce these uncertainties. A proposal that lacks this
admission of uncertainties coupled with designs of experiments that will reduce such
uncertainties should not be deemed credible for the purposes of any important
decision. We might call this requirement a requirement for a falsifiable design. (Szabo,
2007)
In the case of brain emulation this would mean not only sketching how a brain emulator
would work if it could be built and a roadmap of technologies needed to implement it, but
also a list of the main uncertainties in how it would function and proposed experiments to
reduce these uncertainties.
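As a toy illustration of this discipline (every entry below is a hypothetical example, not a claim drawn from this report), such an uncertainty map can be kept as a simple machine-checkable table pairing each untested assumption with an experiment that would reduce it:

```python
from dataclasses import dataclass

@dataclass
class Uncertainty:
    """One untested design assumption and an experiment that would reduce it."""
    assumption: str
    proposed_test: str
    tested: bool = False

# Hypothetical example entries, for illustration only.
uncertainty_map = [
    Uncertainty("Scale separation exists at the spiking-network level",
                "Compare compartment-level and spiking-level simulations "
                "of a fully mapped microcircuit"),
    Uncertainty("Fixation preserves functionally relevant synaptic state",
                "Scan fixed tissue and compare against electrophysiology "
                "recorded before fixation"),
]

def untested_assumptions(design):
    """A falsifiable design should pair every one of these with a test."""
    return [u.assumption for u in design if not u.tested]
```

On Szabo's criterion, a proposal remains credible only while the list returned by `untested_assumptions` shrinks as the corresponding experiments are carried out.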
It is important to emphasize the long‐term and speculative nature of many aspects of this
roadmap, which in any case is to be regarded only as a first draft—to be updated, refined,
and corrected as better information becomes available. Given the difficulties and
uncertainties inherent in this type of work, one may ask whether our study is not premature.
Our view is that when the stakes are potentially extremely high, it is important to apply the
best available methods to try to understand the issue. Even if these methods are relatively
weak, it is the best we can do. The alternative would be to turn a blind eye to what could turn
out to be a pivotal development. Without first studying the question, how is one to form any
well‐grounded view one way or the other as to the feasibility and proximity of a prospect like
WBE?
Thanks to
We would like to warmly thank the many people who have commented on the paper and
helped extend and improve it:
Workshop participants: John Fiala, Robin Hanson, Kenneth Jeffrey Hayworth, Todd
Huffman, Eugene Leitl, Bruce McCormick, Ralph Merkle, Toby Ord, Peter Passaro, Nick
Shackel, Randall A. Koene, Robert A. Freitas Jr and Rebecca Roache.
Other useful comments: Stuart Armstrong.
kinds of low level phenomena. We also need some understanding of higher level phenomena
to test our simulations and know what kind of data we need to pursue. Fostering the right
research cycle for developing the right understanding, collecting data, improving
instrumentation, and experimenting with limited emulations – in addition to providing
useful services to related fields and beneficial spin‐offs – would be indispensable for the
development of WBE.
Levels of emulation and success criteria
For the brain, several levels of success criteria for emulation can be used. They form a
hierarchy extending from low-level targets to complete emulation: parts list (1a) and
complete scan (1b), brain database (2), functional brain emulation (3), species generic
brain emulation (4), individual brain emulation (5), and finally social role-fit emulation
(6a), mind emulation (6b) and personal identity emulation (6c). See Table 1.

Figure 2: Success levels for WBE.

Not shown in this hierarchy are emulations of subsystems or small volumes of the brain,
"partial brain emulations". Properly speaking, a complete scan, parts list and brain
database (1a, 1b and 2) do not constitute successful brain emulation, but such
achievements (and partial brain emulations) would in any case be important milestones and
useful in themselves.

Similarly, the high-level achievements related to social roles, mental states, and
personal identity (6a, 6b and 6c) are both poorly understood and hard to operationalize,
but given the philosophical interest in WBE we have included them here for completeness.
It is not obvious how these criteria relate to one another, or to what extent they might
be entailed by the criteria for 4 and 5.
Achieving the third success criterion beyond a certain resolution would, assuming some
supervenience thesis, imply success of some or all of the other criteria. A full quantum‐
mechanical N‐body or field simulation encompassing every particle within a brain would
plausibly suffice even if “quantum mind” theories are correct. At the very least a 1‐to‐1
material copy of the brain (a somewhat inflexible and very particular kind of emulating
computer) appears to achieve all criteria, possibly excepting those for 6c. However, this is
likely an excessively detailed level since the particular phenomena we are interested in (brain
function, psychology, mind) appear to be linked to more macroscopic phenomena than
detailed atomic activity.
Given the complexities and conceptual issues of consciousness we will not examine criteria
6abc, but mainly examine achieving criteria 1‐5.
Table 1: Success Criteria

1a "Parts list": An inventory of all objects on a particular size scale, their properties
and interactions. (Relevant properties: low level neural structure, chemistry, dynamics
accurate to resolution level.)

1b "Complete scan": A complete 3D scan of a brain at high resolution. (Relevant
properties: resolution, information enabling structure to function mapping.)

2 "Brain database": Combining the scan and parts list into a database mapping the low
level objects of a brain. (Relevant properties: 1-to-1 mapping of scan to
simulation/emulation objects.)

3 "Functional brain emulation": The emulation simulates the objects in a brain database
with enough accuracy to produce (at least) a substantial range of species-typical basic
emergent activity of the same kind as a brain (e.g. a slow wave sleep state or an awake
state). (Relevant properties: generically correct causal micro-dynamics.)

4 "Species generic brain emulation": The emulation produces the full range of (human)
species-typical emergent behavior and learning capacity. (Relevant properties: long term
dynamics and adaptation; appropriate behaviour responses; full-range learning capacity.)

5 "Individual brain emulation": The emulation produces emergent activity characteristic
of that of one particular (fully functioning) brain. It is more similar to the activity
of the original brain than any other brain. (Relevant properties: correct internal and
behaviour responses; retains most memories and skills of the particular brain that was
emulated. In an emulation of an animal brain, it should be possible to recognize the
particular (familiar) animal.)

6a "Social role-fit emulation"/"Person emulation": The emulation is able to fill and be
accepted into some particular social role, for example to perform all the tasks required
for some normally human job. (Socio-economic criteria involved.) (Relevant properties:
depend on which (range of) social roles the emulation would be able to fit. In a limiting
case, the emulation would be able to pass a personalized Turing test: outsiders familiar
with the emulated person would be unable to detect whether responses came from the
original person or the emulation.)

6b "Mind emulation": The emulation produces subjective mental states (qualia, phenomenal
experience) of the same kind that would have been produced by the particular brain being
emulated. (Philosophical criteria involved.) (Relevant properties: the emulation is truly
conscious in the same way as a normal human being.)

6c "Personal identity emulation": The emulation is correctly described as a continuation
of the original mind; either as numerically the same person, or as a surviving continuer
thereof. (Philosophical criteria involved.) (Relevant properties: the emulation is an
object of prudentially rational self-concern for the brain to be emulated.)
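The entailment structure of criteria 1a through 5 can be made explicit in a few lines. The sketch below encodes Table 1's ladder (1a and 1b as independent siblings, 6a-6c omitted since their relation to the other criteria is unclear) and collects, for any level, the milestones it presupposes:

```python
# Prerequisite structure of the success criteria in Table 1. Levels 1a and 1b
# are independent siblings; 6a-6c are omitted because their relation to the
# other criteria is unclear.
PREREQS = {
    "1a parts list": [],
    "1b complete scan": [],
    "2 brain database": ["1a parts list", "1b complete scan"],
    "3 functional brain emulation": ["2 brain database"],
    "4 species generic brain emulation": ["3 functional brain emulation"],
    "5 individual brain emulation": ["4 species generic brain emulation"],
}

def all_prereqs(level, table=PREREQS):
    """Transitively collect every milestone that `level` builds on."""
    seen = []
    for p in table[level]:
        for q in all_prereqs(p, table) + [p]:
            if q not in seen:
                seen.append(q)
    return seen
```

For example, `all_prereqs("3 functional brain emulation")` yields the parts list, complete scan and brain database milestones, matching the observation above that those achievements are necessary way-stations rather than emulation itself.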
Simulation scales
The widely reproduced diagram from (Churchland and Sejnowski, 1992) in Figure 3 depicts
the various levels of organisation in the nervous system ordered by size scale, running from
the molecular level to the entire system. Simulations (and possibly emulations) can occur on
all levels:
• Molecular simulation (individual molecules)
• Chemistry simulation (concentrations, law of mass action)
• Genetic expression
• Compartment models (subcellular volumes)
• Whole cell models (individual neurons)
• Local network models (replaces neurons with network modules such as
minicolumns)
• System models
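To give a sense of the range this hierarchy covers, the characteristic scales can be tabulated. The specific lengths below are rough order-of-magnitude assumptions made for this sketch, not figures from the report:

```python
import math

# Characteristic length scale of each organisational level in the Churchland
# and Sejnowski hierarchy. Values are rough order-of-magnitude assumptions
# made for this sketch, not figures from the report.
SCALE_METRES = {
    "molecules": 1e-10,
    "synapses": 1e-6,
    "neurons": 1e-4,
    "local networks": 1e-3,
    "maps": 1e-2,
    "systems": 1e-1,
    "CNS": 1.0,
}

def span_orders_of_magnitude(scales=SCALE_METRES):
    """Orders of magnitude separating the largest level from the smallest."""
    return math.log10(max(scales.values()) / min(scales.values()))
```

Under these assumed values the nervous system spans roughly ten orders of magnitude in length, which is why no single scanning or simulation technique can cover every level.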
Another hierarchy was introduced by John Fiala during the workshop, and will be used with
some adaptations in this document.
Table 2: Levels of emulation

1. Computational module: "Classic AI", high level representations of information and
information processing.

2. Brain region connectivity: Each area represents a functional module, connected to
others according to a (species universal) "connectome" (Sporns, Tononi et al., 2005).

3. Analog network population model: Neuron populations and their connectivity. Activity
and states of neurons or groups of neurons are represented as their time-averages. This
is similar to connectionist models using ANNs, rate-model neural simulations and cascade
models.

4. Spiking neural network: As above, plus firing properties, firing state and dynamical
synaptic states. Integrate-and-fire models, reduced single compartment models (but also
some minicolumn models, e.g. (Johansson and Lansner, 2007)).

5. Electrophysiology: As above, plus membrane states (ion channel types, properties,
state), ion concentrations, currents, voltages and modulation states. Compartment model
simulations.

6. Metabolome: As above, plus concentrations of metabolites and neurotransmitters in
compartments.

7. Proteome: As above, plus concentrations of proteins and gene expression levels.

8. States of protein complexes: As above, plus quaternary protein structure.

9. Distribution of complexes: As above, plus "locome" information and internal cellular
geometry.

10. Stochastic behaviour of single molecules: As above, plus molecule positions, or a
molecular mechanics model of the entire brain.

11. Quantum: Quantum interactions in and between molecules.
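To see how steeply demands grow down this table, consider a crude cost model. Every constant below is an illustrative assumption for this sketch, not a figure from the report (the report's own estimates are in Appendix A):

```python
# Illustrative back-of-the-envelope constants. These are assumptions made for
# this sketch, not figures from the report (see Appendix A for its estimates).
NEURONS = 1e11              # rough human neuron count
SYNAPSES = NEURONS * 1e4    # assuming ~10^4 synapses per neuron

def flops_needed(updates_per_second, flops_per_update):
    """Crude cost model: every synapse-equivalent state updated at a fixed rate."""
    return SYNAPSES * updates_per_second * flops_per_update

# Level 4 (spiking network): ~100 Hz effective update rate, ~1 FLOP per update.
spiking = flops_needed(1e2, 1)
# Level 5 (electrophysiology): ~10 kHz timestep, ~100 FLOPs per update.
electrophysiology = flops_needed(1e4, 100)
```

Under these assumed numbers, descending a single level from 4 to 5 multiplies the demand by four orders of magnitude (roughly 10^17 versus 10^21 FLOPS), which illustrates why the choice of emulation level dominates the hardware requirements.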
The amount of understanding needed to accurately simulate the relevant objects tends to
increase radically for higher (here, low‐numbered) levels: while basic mechanics is well
understood, membrane biophysics is complex, and the computational functions of brain areas
are likely exceedingly multifaceted. Conversely, the amount of computing power needed
increases rapidly as we descend towards lower levels of simulation, and may become
fundamentally infeasible on level 11 (see note). The amount and type of data needed to fully specify a

Note: but see also the final chapter of (Hameroff, 1987). The main stumbling block of level 11 simulation may not be
computing hardware or understanding but fundamental quantum limitations on scanning.
Figure 4: Technological capabilities needed for WBE. [Diagram: whole brain emulation
requires scanning (preparation, physical handling, and imaging with sufficient
resolution, volume and functional information), image processing (geometric adjustment,
noise removal, data interpolation, tracing), scan interpretation (cell type
identification, synapse identification, connectivity identification, parameter
estimation, databasing), translation into a software model of the neural system
(mathematical model, efficient implementation), and simulation (CPU, bandwidth, storage,
body simulation, environment simulation).]
Table 3: Capabilities needed for WBE

Scanning
  Preprocessing/fixation: Preparing brains appropriately, retaining relevant
  microstructure and state.
  Physical handling: Methods of manipulating fixed brains and tissue pieces before,
  during, and after scanning.
  Imaging volume: Capability to scan entire brain volumes in reasonable time and expense.
  Imaging resolution: Scanning at enough resolution to enable reconstruction.
  Functional information: Scanning is able to detect the functionally relevant properties
  of tissue.

Image processing
  Geometric adjustment: Handling distortions due to scanning imperfection.
  Data interpolation: Handling missing data.
  Noise removal: Improving scan quality.
  Tracing: Detecting structure and processing it into a consistent 3D model of the tissue.

Scan interpretation
  Cell type identification: Identifying cell types.
  Synapse identification: Identifying synapses and their connectivity.
  Parameter estimation: Estimating functionally relevant parameters of cells, synapses,
  and other entities.
  Databasing: Storing the resulting inventory in an efficient way.

Translation (software model of neural system)
  Mathematical model: Model of entities and their behaviour.
  Efficient implementation: Implementation of model.

Simulation
  Storage: Storage of original model and current state.
  Bandwidth: Efficient inter-processor communication.
  CPU: Processor power to run simulation.
  Body simulation: Simulation of body enabling interaction with virtual environment or
  through robot.
  Environment simulation: Virtual environment for virtual body.
Linkages
Most of the capabilities needed for WBE are independent of each other, or form synergistic
clusters. Clusters of technologies develop together, supporting and motivating each other
with their output. A typical example is better mathematical models stimulating a need for
better implementations and computing capacity, while improvements in the latter two
stimulate interest in modelling. Another key cluster is 3D microscopy and image processing,
where improvements in one make the other more useful.
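The cluster idea can be sketched as a trivial grouping. The cluster memberships below are an informal reading of this section, not an authoritative dependency list:

```python
# Illustrative synergy clusters among WBE capabilities. The groupings are an
# informal reading of this section, not an authoritative dependency list.
CLUSTERS = {
    "modelling": {"mathematical model", "efficient implementation",
                  "computing capacity"},
    "imaging": {"3D microscopy", "image processing"},
}

def co_developing(capability):
    """Capabilities expected to advance together with `capability`."""
    for members in CLUSTERS.values():
        if capability in members:
            return members - {capability}
    return set()
```

A capability outside every cluster simply returns the empty set: on this reading it can be developed (or fail to develop) independently of the rest.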
There are few clear cases where a capability needs a completed earlier capability in order to
begin development. Current fixation and handling methods are likely unable to meet the
Figure 5: WBE roadmap. [Diagram: scanning development, interpretation development and
low-level neuroscience feed an automated pipeline that yields a complete inventory and
partial emulations; together with organism simulation, full cell simulation, body
simulation and simulation hardware, these support a ladder of emulations of increasing
complexity: eutelic organism, invertebrate, small mammal, large mammal and finally human
emulation. Supporting milestones include validation methods, ground truth models,
determining the appropriate level of simulation, and deducing function.]
The key milestones are:
Ground truth models: a set of cases where the biological “ground truth” is known and can be
compared to scans, interpretations and simulations in order to determine their accuracy.
Determining appropriate level of simulation: this includes determining whether there exists
any suitable scale separation in brains (if not, the WBE effort may be severely limited), and if
so, on what level. This would then be the relevant scale for scanning and simulation.
Full cell simulation: a complete simulation of a cell or similarly complex biological system.
While strictly not necessary for WBE it would be a test case for large‐scale simulations.
Technology drivers

Figure 6: Technology drivers for WBE-necessary technologies. [Diagram: the capability map
of Figure 4, with each capability classified by its main development driver: Moore's law
driven, commercial drivers, research drivers, or WBE-specific.]
Different required technologies have different support and drivers for development.
towards large-scale neuroscience, where automated methods will play an increasingly
prominent role as they have done in genomics.
Figure 8: Caenorhabditis elegans, a popular model organism with a fully mapped 302
neuron nervous system.
Selection of suitable model systems. Selecting the right targets for scanning and modelling
will have to take into account existing knowledge, existing research communities, likelihood
of funding and academic impact as well as practical factors. While the C. elegans nervous
system has been completely mapped (White, Southgate et al., 1986), we still lack detailed
electrophysiology, likely because of the difficulty of investigating the small neurons. Animals
with larger neurons may prove less restrictive for functional and scanning investigation but
may lack sizeable research communities. Molluscs such as freshwater snails and insects such
as fruit flies may be promising targets. They have well characterised brains, existing research
communities and neural networks well within current computational capabilities.
Similarly, the selection of subsystems of the brain to study requires careful consideration.
Some neural systems are heavily studied (cortical columns, the visual system, the
hippocampus) and better data about them would be warmly received by the research
community, yet the lack of characterization of their inputs, outputs and actual function may
make development of emulation methods very hard. One system that may be very promising
is the retina, which has an accessible geometry, is well studied and somewhat well
understood, is not excessively complex, and better insights into which would be useful to a
wide research community. Building on retinal models, models of the lateral geniculate
nucleus and visual cortex may be particularly useful, since they would both have relatively
well‐defined inputs from the previous stages.
At what point will the potential be clear enough to bring major economic actors into WBE?
Given the potentially huge economic impact of human WBE (Hanson, 1994, 2008a, 2004,
2008b), if the field shows sufficient promise, major economic actors will become interested in
funding and driving the research as an investment. It is unclear how far advanced the field
would need to be in order to garner this attention. Solid demonstrations of key technologies
are likely required, as well as a plausible path towards profitable WBE. The impact of funding
on progress will depend on the main remaining bottlenecks and their funding elasticity. If
scanning throughput or computer power is the limiting factor, extra funding can relatively
easily scale up facilities. By contrast, limitations in neuroscience understanding are less
responsive to investment. If funding arrives late, when the fundamental problems have
already been solved, the amplified resources would be used to scale up pre‐existing small‐
scale demonstration projects.