
Understanding the brain


2016 ORAU Annual Meeting of the Council of Sponsoring Institutions
C. Denise Caldwell
Division Director, Physics Division
National Science Foundation

Published in: Government & Nonprofit

  1. Applications of Big Data Analytics: Understanding the Brain. ORAU Council of Sponsoring Institutions, 9-10 March 2016, Oak Ridge National Laboratory. C. Denise Caldwell, Division Director, Physics Division, National Science Foundation.
  2. NSF Approach to BRAIN: Understanding the Brain is part of a broad emphasis on neuroscience that includes BRAIN. Consider it a prototype for the challenges of bringing big data into individual-based science. Focus areas: multi-scale integration of the dynamic activity and structure of the brain; neurotechnology and research infrastructure; quantitative theory and modeling of brain function; brain-inspired concepts and designs; and BRAIN workforce development.
  3. [Diagram] The current serial, individual-investigator mode of doing science: measurement, computation, theory, and models each feed separate journals (X, X′, Y, Y′, Z, Z′, ...). There is a desperate need for a parallel, team mode of doing science, in which measurement, computation, theory, and models are combined and published together as an integrated article (Journal I).
  4. The Data Problem: Solutions in Other Fields. Examples of what is needed to succeed in team efforts and in using large amounts of data. Astronomy, the Sloan Digital Sky Survey (SDSS): common types of SDSS data and links to all SDSS data-access tools (data sharing); details on the SDSS imaging pipeline and the calibration process (metadata). Physics, the Large Hadron Collider (LHC): data shared by large scientific collaborations; event selection, in which theory informs whether to keep or discard data, reducing the data volume by 90%.
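The LHC-style event selection described above can be sketched as a theory-informed filter applied before storage. Everything here is illustrative: the event structure, the summed-energy statistic, and the threshold value are hypothetical stand-ins, not actual trigger criteria.

```python
# Illustrative sketch of theory-informed event selection: keep only
# events whose summed energy exceeds a theory-motivated threshold,
# discarding the rest before they are ever stored.
# The threshold and event fields are hypothetical.

THRESHOLD = 45.0  # hypothetical energy cut, arbitrary units

def select_events(events, threshold=THRESHOLD):
    """Return only the events worth keeping, per the theory-informed cut."""
    return [e for e in events if sum(e["energies"]) > threshold]

# Toy data: 10 events, most with low total energy.
events = [{"id": i, "energies": [i * 2.0, i * 3.5]} for i in range(10)]
kept = select_events(events)
reduction = 1 - len(kept) / len(events)  # fraction of data discarded
```

With this toy threshold, 9 of 10 events are discarded, matching the ~90% reduction figure on the slide; real trigger systems make this decision in hardware and software at far higher rates.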
  5. The Challenge. In hindsight, these were straightforward cases: a unified community, well-developed theoretical guidance, common data structures, shared analysis tools, tremendous scientific payoff, and well-developed mechanisms for handling large data loads. Neuroscience is complicated by additional factors: a non-uniform community (research on different species with different scientific goals); multiple and disparate data structures (different experimental tools); lack of experience with large data sets (potentially multiple terabytes); lack of quantitative mathematical and computational tools (limited pre-selection options); and lack of a perceived need for the team effort. The serial approach is very important and will continue to advance the science, but progress on some problems demands the team approach. (Think of the impact that parallel computing had on what could be done with computation.)
  6. Challenges: diversity of approaches to study the brain. In vivo imaging, whole-brain imaging, optogenetics, molecular biology, neuroanatomy, comparative analysis, cognition and behavior, theory and modeling.
  7. Challenges: neuroscience is rapidly becoming a computational and data-intensive science. Heterogeneous data come from a plethora of techniques across disciplines and scales, from cellular to behavioral. Computational modeling and data-comparison approaches are highly diverse across systems and species. An explosion in imaging techniques generates "big data": volume imaging of whole-brain activity in zebrafish produces ~1 TB of data (Freeman et al. 2014).
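Data sets of the ~1 TB scale cited above cannot be held in memory at once, so analyses are typically chunked or distributed (the Freeman et al. work used a cluster-based pipeline). Below is a minimal single-machine sketch of the chunking idea, assuming a simple (time x voxels) array layout and a per-voxel mean statistic; both are illustrative choices.

```python
import numpy as np

# Minimal sketch: process a large (time x voxels) imaging array in
# chunks of voxels so only one chunk is in memory at a time.
def chunked_mean(data, chunk_voxels=1000):
    """Per-voxel mean activity, computed one voxel-chunk at a time."""
    n_time, n_vox = data.shape
    means = np.empty(n_vox)
    for start in range(0, n_vox, chunk_voxels):
        stop = min(start + chunk_voxels, n_vox)
        means[start:stop] = data[:, start:stop].mean(axis=0)
    return means

# Toy stand-in for a terabyte-scale recording; real data would be
# memory-mapped (e.g. np.memmap) or distributed across a cluster.
rng = np.random.default_rng(0)
data = rng.standard_normal((100, 5000))
means = chunked_mean(data)
```

The same loop works unchanged if `data` is a memory-mapped file, which is the usual first step before moving to a distributed framework.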
  8. The Brain Imaging Problem. A whole mouse brain imaged at 1 nm (synaptic) resolution is ~1 zettabyte (10^21 bytes); an atlas of the human genome is ~3.2 gigabytes. By any current measure the problem is intractable and must be broken down into subsets. Serial approach: very small subsets, linked together in later phases. Parallel approach: larger subsets, optimized to consider all experimental, computational, and theoretical needs from the outset, with the potential for linking at earlier stages.
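The zettabyte estimate above can be checked with back-of-envelope arithmetic. The mouse brain volume (~0.5 cm^3) and the 2 bytes per voxel assumed below are illustrative inputs chosen to show the calculation, not figures from the slide.

```python
# Back-of-envelope check of the ~1 ZB estimate for a whole mouse
# brain imaged at 1 nm voxel resolution. The 0.5 cm^3 volume and
# 2 bytes/voxel are assumed values for illustration.
NM_PER_CM = 1e7

brain_volume_cm3 = 0.5                     # assumed mouse brain volume
voxels = brain_volume_cm3 * NM_PER_CM**3   # number of 1 nm^3 voxels
bytes_total = voxels * 2                   # assumed 2 bytes per voxel
zettabytes = bytes_total / 1e21            # ~1 ZB, as on the slide

genome_bytes = 3.2e9                       # human genome atlas, from the slide
ratio = bytes_total / genome_bytes         # ~3 x 10^11 times larger
```

Under these assumptions the imaging data set comes out at roughly 1 ZB, about eleven orders of magnitude larger than the genome atlas, which is why the slide calls the full problem intractable without subdivision.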