XII Experimental Chaos and Complexity Conference talk, Michigan, US.
Related papers:
http://journals.aps.org/pre/abstract/10.1103/PhysRevE.88.012712
http://www.biomedcentral.com/1471-2202/14/S1/O18
This document discusses fuzzy logic, defining it as a multivalued logic that allows partial degrees of set membership. It explains how fuzzy logic models imprecise human reasoning through fuzzy sets, fuzzy rules, and a Mamdani-style fuzzy inference process. Finally, it lists some application domains, such as expert systems and intelligent control.
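The Mamdani-style inference process summarized above (fuzzification, rule evaluation, aggregation, centroid defuzzification) can be sketched in a few lines of Python. The membership functions, rules, and the fan-speed example below are invented for illustration and are not taken from the talk.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def mamdani_fan_speed(temp):
    """Two-rule Mamdani controller: IF cold THEN slow; IF hot THEN fast."""
    # Fuzzification: partial degrees of membership, not crisp sets.
    cold = tri(temp, 0, 10, 25)
    hot = tri(temp, 15, 30, 40)
    # Clip each rule's output set by its firing strength (min),
    # aggregate with max, then defuzzify by centroid over a
    # discretized output universe of fan speeds 0..100.
    num = den = 0.0
    for speed in range(0, 101):
        mu = max(min(cold, tri(speed, 0, 20, 60)),
                 min(hot, tri(speed, 40, 80, 100)))
        num += speed * mu
        den += mu
    return num / den if den else 0.0
```

A cold reading (e.g. 5) should defuzzify to a lower fan speed than a hot one (e.g. 35), which is easy to check by calling the function at both points.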
2006: Bio-Inspired Computing - New Perspectives for Research in Biology (Leandro de Castro)
The document discusses bio-inspired computing and how biology can serve as inspiration for the development of computational tools. It covers concepts such as self-organization, simulation, and adaptation, and presents case studies such as artificial neural networks, swarm intelligence, and artificial immune systems inspired by biological systems.
High-throughput computation and machine learning methods applied to materials... (Anubhav Jain)
High-throughput computation and machine learning methods can be applied to materials design problems at scale. Density functional theory (DFT) allows modeling of materials at the quantum mechanical level but large computational resources are required. "High-throughput DFT" uses automation, parallelization across supercomputers, and data mining approaches to rapidly screen millions of potential new materials in silico before experimental validation. This helps address the challenge of discovering new materials for applications like energy technologies by searching the vast space of possible compositions and structures more efficiently than traditional experimentation alone.
Going Smart and Deep on Materials at ALCF (Ian Foster)
As we acquire large quantities of science data from experiment and simulation, it becomes possible to apply machine learning (ML) to those data to build predictive models and to guide future simulations and experiments. Leadership Computing Facilities need to make it easy to assemble such data collections and to develop, deploy, and run associated ML models.
We describe and demonstrate here how we are realizing such capabilities at the Argonne Leadership Computing Facility. In our demonstration, we use large quantities of time-dependent density functional theory (TDDFT) data on proton stopping power in various materials maintained in the Materials Data Facility (MDF) to build machine learning models, ranging from simple linear models to complex artificial neural networks, that are then employed to manage computations, improving their accuracy and reducing their cost. We highlight the use of new services being prototyped at Argonne to organize and assemble large data collections (MDF in this case), associate ML models with data collections, discover available data and models, work with these data and models in an interactive Jupyter environment, and launch new computations on ALCF resources.
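As a toy illustration of the workflow described above (training a cheap surrogate model on completed simulation results, then using it to steer which expensive computations to launch), here is a minimal sketch. The linear model, the stand-in `expensive_simulation` function, and the triage threshold are all invented for this example; the actual ALCF pipeline uses real TDDFT data and richer models.

```python
import random

def fit_linear(xs, ys):
    """Ordinary least squares for y ~ m*x + b (closed form, 1-D)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

def expensive_simulation(x):
    # Stand-in for a full simulation; in reality this costs many core-hours.
    return 2.0 * x + 1.0 + random.gauss(0, 0.1)

# Train a cheap surrogate on a few completed runs...
random.seed(0)
train_x = [0.0, 0.5, 1.0, 1.5, 2.0]
train_y = [expensive_simulation(x) for x in train_x]
m, b = fit_linear(train_x, train_y)

# ...then triage new candidates: only those whose predicted value
# clears a threshold are queued for the full computation.
candidates = [0.2, 0.9, 1.7, 2.4]
queued = [x for x in candidates if m * x + b > 3.0]
```

The same pattern scales from this linear model to the neural-network surrogates mentioned in the abstract; only the `fit` step changes.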
Zhou Changsong presents a document discussing the brain as a complex dynamical network system subject to constraints of cost and function. It aims to reconcile irregular neuronal spiking with neural avalanches through a biologically plausible neuronal network model and statistical-physics analysis. The key finding is that the model exhibits the coexistence of irregular spiking, oscillations, and critical avalanches via a Hopf bifurcation in the mean-field model, which explains critical neural avalanches corresponding to irregular spiking in the microscopic neuronal network model. This multiscale variability in brain activity reflects principles of cost-efficient neural representation and dynamics.
PowerPoint slides from a 2015 Guest Lecture in PSYCH-268A: Computational Neuroscience, Prof. Jeff Krichmar, University of California, Irvine (UCI).
Corresponding publication:
Beyeler*, M., Carlson*, K. D., Chou*, T.-S., Dutt, N., & Krichmar, J. L. (2015). CARLsim 3: A user-friendly and highly optimized library for the creation of neurobiologically detailed spiking neural networks. Proceedings of the IEEE International Joint Conference on Neural Networks (IJCNN), Killarney, Ireland. (*equal contribution)
Analogy, Causality, and Discovery in Science: The engines of human thought
13 January 2015, Tuesday
12:45 pm – 2:00 pm
The venue has been changed to RMS 101, Runme Shaw Bldg., HKU
By Professor Kevin Niall DUNBAR,
College of Education, University of Maryland, College Park, US
http://sol.edu.hku.hk/analogy-causality-discovery-science-engines-human-thought/
This document discusses the physics of intelligence and computation by biological systems. It introduces a physics-based theory of computation based on two axioms: 1) the most basic operation is to distinguish something from nothing, and 2) computation must abide by causality. It describes how a cortical neuron can be reverse-engineered using this framework by modeling its inputs and outputs as codebooks and developing statistical distributions and decision rules to model its behavior. The goal is to develop a fundamental theory of computation that applies broadly to intelligent and biological systems based on quantifying causality and distinction.
How can we harness the Human Brain Project to maximize its future health a... (SharpBrains)
In early 2013, the European Union selected the Human Brain Project, coordinated by Lausanne’s Federal Institute of Technology (EPFL), as the recipient of over 1 billion euros (1.3 billion dollars) over the next ten years. How can the research agenda of this major initiative, and closely related ones, be organized and augmented with partnerships with the private sector and cross-sector stakeholders? How can we start building brain health innovation platforms and delivery systems at the intersection of neuroscience, IT, and engineering?
- Chair: Hilal Lashuel, Associate Professor at the Swiss Federal Institute of Technology-Lausanne (EPFL), YGL Class of 2012
- Sean Hill, co-Director of the Blue Brain Project and co-Director of Neuroinformatics in the Human Brain Project (HBP) at the Swiss Federal Institute of Technology-Lausanne (EPFL)
This session took place at the 2013 SharpBrains Virtual Summit: http://sharpbrains.com/summit-2013/agenda/
Computational Training for Domain Scientists & Data Literacy (Joshua Bloom)
Data literacy connects the knowledge of deep concepts of statistics, computer science, visualization, and domain science with the practical understanding of the data---and the stories it tells---that we encounter in our lives. As data becomes more pervasive, we see teaching data literacy to students as part of a broad education as an imperative for the 21st century. Learning how to arm the next generation with the tools to make the most of data, and to avoid its common pitfalls, will be a major thrust of our efforts with the Moore/Sloan initiative at Berkeley, through the Berkeley Institute for Data Science.
This curriculum vitae summarizes Mark V. Albert's educational and professional background. He received a Ph.D. in Computational Biology from Cornell University, where he researched neural development and coding. His work has included postdoctoral research at Northwestern University, research consulting, and graduate studies at Carnegie Mellon University focusing on computational neuroscience. He has received several honors and awards for his academic and research achievements.
Combining density functional theory calculations, supercomputing, and data-dr... (Anubhav Jain)
The document summarizes how computational materials science using density functional theory (DFT) calculations, supercomputing, and data-driven methods can help design new materials faster than traditional experimental approaches. It describes how high-throughput DFT calculations are run on supercomputers to screen large numbers of potential materials. The results are compiled in open databases like the Materials Project to be shared and reused by researchers. While computational limitations remain, combining computation and data is helping accelerate the discovery of new materials with improved properties for applications like batteries, thermoelectrics, and carbon capture.
Edited and revised: an overview of the international and interdisciplinary Gordon Research Conference on Visualization in Science and Education, with information on key researchers in cognitive science and other areas of visualization. Covers the history of the conference, an NSF workshop, and research on learning with visualizations.
Complex systems are characterized by constituents -- from neurons in the brain to individuals in a social network -- which exhibit special structural organization and nonlinear dynamics. As a consequence, a complex system cannot be understood by studying its units separately because their interactions lead to unexpected emerging phenomena, from collective behavior to phase transitions.
Recently, we have discovered that a new level of complexity characterizes a variety of natural and artificial systems in which units interact simultaneously in distinct ways. This is the case, for instance, for multimodal transportation systems (e.g., metro, bus, and train networks) or for biological molecules, whose interactions may differ in type (e.g., physical, chemical, genetic) or functionality (e.g., regulatory, inhibitory). The newfound wealth of multivariate data makes it possible to categorize a system's interdependencies by defining distinct "layers", each encoding a different network representation of the system. The result is a multilayer network model.
Analyzing data from different domains -- including molecular biology, neuroscience, urban transport, and telecommunications -- we will show that neglecting multivariate information can lead to poor results. Conversely, multilayer models provide a suitable framework for complex data analytics, allowing us to quantify the resilience of a system to perturbations (e.g., localized failures or targeted attacks), and improving the forecasting of spreading processes and the accuracy of classification.
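A minimal sketch of the multilayer idea, assuming a toy two-layer transport system (the layer names, nodes, and edges below are invented): each layer keeps its own edge set over a shared node set, so per-layer degrees stay distinguishable from the aggregate degree that a single-layer analysis would collapse them into.

```python
# Each "layer" is a separate edge set over the same node set.
layers = {
    "metro": {("A", "B"), ("B", "C")},
    "bus":   {("A", "C"), ("C", "D")},
}
nodes = {n for edges in layers.values() for e in edges for n in e}

def degree(node, layer_edges):
    """Number of edges in one layer that touch the node."""
    return sum(node in e for e in layer_edges)

# Per-layer degrees can differ sharply from the aggregated degree,
# which is exactly the information aggregate analyses throw away.
for n in sorted(nodes):
    per_layer = {name: degree(n, edges) for name, edges in layers.items()}
    print(n, per_layer, "aggregate:", sum(per_layer.values()))
```

Here node "C" has degree 1 in the metro layer but 2 in the bus layer; the aggregate degree 3 alone cannot tell those roles apart.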
Feature Extraction and Classification of NIRS Data (Pritam Mondal)
A thesis submitted to the Department of Electronics and Communication Engineering, Khulna University of Engineering & Technology, Khulna, Bangladesh, in partial fulfillment of the requirements for the degree of Bachelor of Science in Electronics and Communication Engineering.
Fundamental Limits of Recovering Tree Sparse Vectors from Noisy Linear Measur... (sonix022)
The document discusses theoretical limits on recovering tree-sparse signals from noisy linear measurements. It presents a simple adaptive sensing algorithm that can exactly recover the support of a tree-sparse signal using a minimal number of measurements. The algorithm operates by adaptively selecting measurement locations based on previous measurements. The document proves that this algorithm can recover the support if the nonzero coefficients are sufficiently large, and poses the open question of whether any algorithm can achieve support recovery with significantly smaller coefficients. It places this work in the context of establishing fundamental limits on exact support recovery for both structured and unstructured sparse signals under both non-adaptive and adaptive sensing strategies.
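The adaptive strategy described (measure a node, then descend into its children only when the measurement clears a threshold) can be sketched as follows. The tree, signal values, threshold, and noise level are invented for illustration and are not the paper's actual construction.

```python
import random

def adaptive_tree_recovery(signal, children, root, tau, noise=0.0):
    """Recover the support of a tree-sparse signal by measuring one
    node at a time and descending only when the (noisy) measurement
    exceeds tau; the true support is a rooted connected subtree."""
    support, frontier = set(), [root]
    while frontier:
        node = frontier.pop()
        y = signal.get(node, 0.0) + random.gauss(0, noise)  # one measurement
        if abs(y) > tau:
            support.add(node)
            frontier.extend(children.get(node, []))
    return support

# Toy binary tree on 7 nodes; the nonzero coefficients form the
# rooted subtree {0, 1, 4}, so whole branches are never measured.
children = {0: [1, 2], 1: [3, 4], 2: [5, 6]}
signal = {0: 1.0, 1: 0.8, 4: 0.9}
random.seed(1)
print(adaptive_tree_recovery(signal, children, 0, tau=0.4, noise=0.05))
```

Because node 2 fails the threshold, its subtree (nodes 5 and 6) is never measured: the adaptivity is what lets the measurement count stay well below the total number of coordinates, mirroring the paper's point that sufficiently large coefficients make exact support recovery possible.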
Materials discovery through theory, computation, and machine learning (Anubhav Jain)
The document discusses using theory, computation, and machine learning to discover new materials. Density functional theory (DFT) can model material properties from first principles, and DFT calculations have been automated and run on supercomputers to enable high-throughput screening of materials. Examples are given of computational predictions later confirmed experimentally, such as sidorenkite cathodes for sodium-ion batteries. Related projects are outlined, including the open Materials Project database of DFT data on over 85,000 materials and software libraries supporting high-throughput computation and materials science. Text mining of the scientific literature to help predict new materials in advance is also discussed.
Distance oracle - fast queries of the distance between any two vertices in a graph (Hong Ong)
A review of methods for quickly computing the distance between any two vertices in a graph, with applications in many fields such as telecom, internet routing, and social network analysis.
In the mid-1990s, the high-energy physics community (think FermiLab and CERN) started planning for the Large Hadron Collider. Managing the petabytes of data that would be generated by the facility and sharing it with the globally distributed community of over 10,000 researchers would be a major infrastructure and technology problem. This same community that brought us the web has now developed standards, software, and infrastructure for grid computing. In this seminar I'll present some of the exciting science that is being done on the Open Science Grid, the US national cyberinfrastructure linking 60 institutions (Harvard included) into a massive distributed computing and data processing system.
This document provides an overview of computational neuroscience from modeling single neurons to neural circuits and behavior. It discusses:
- Models of single neurons from the Hodgkin-Huxley model to reduced models like FitzHugh-Nagumo and Izhikevich neurons.
- How neurons are organized into neural circuits using different connection types and how properties like synchronization emerge from circuit properties.
- Approaches to modeling larger brain areas as neural populations using techniques like neural fields to model mean firing rates over continuous space.
- Phenomena like neural coding, plasticity, and learning, and their role in computational models of behavior and cognition, with examples of modeling visual attention, decision making, and more.
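As a concrete companion to the reduced single-neuron models listed above, here is a minimal Euler-integration sketch of the Izhikevich model with its standard regular-spiking parameters; the input current and simulation length are arbitrary choices for illustration.

```python
def izhikevich(I=10.0, T=1000.0, dt=0.25,
               a=0.02, b=0.2, c=-65.0, d=8.0):
    """Euler simulation of the Izhikevich model
    (v' = 0.04v^2 + 5v + 140 - u + I, u' = a(bv - u), with the
    reset v <- c, u <- u + d when v crosses 30 mV).
    Returns spike times in ms; defaults are the regular-spiking set."""
    v, u = c, b * c
    spikes = []
    for step in range(int(T / dt)):
        v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:              # spike: record and reset
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

print(len(izhikevich(I=10.0)))     # tonic spiking under constant drive
```

With zero input the neuron settles to rest and emits no spikes; with a constant suprathreshold current it fires tonically, which is the behavior the model is usually introduced with.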
Statistical estimation and inference for large data sets require computationally efficient optimization methods. Remote sensing retrievals are, in fact, estimates of the underlying true state, and their optimization routines must necessarily make compromises in order to keep up with large data volumes. A sub-group of the Remote Sensing Working Group of the SAMSI Program on Mathematical and Statistical Methods for Climate and the Earth System is investigating how optimization in Bayesian-inspired retrievals and off-line statistical methods could be made more computationally efficient. We will report on discussions held to date and describe how progress in the theory of data systems research can positively impact optimization methodologies.
This research paper presents the invention of kinetic bands, based on Romanian mathematician and statistician Octav Onicescu’s kinetic energy, also known as “informational energy”, in which historical data on foreign exchange currencies or indexes are used to predict the trend displayed by a stock or an index and whether it will go up or down in the future. Here, we explore the imperfections of Bollinger Bands to construct a more sophisticated triplet of indicators that predicts the future movement of prices in the stock market. An Extreme Gradient Boosting model was built in Python using a historical data set from Kaggle spanning all 500 currently listed companies, and a feature-importance plot was produced. The results show that kinetic bands derived from kinetic energy (KE) are highly influential features, i.e., technical indicators of stock market trends. Furthermore, the experiments conducted provide tangible empirical evidence for the approach; the machine learning code has a low chance of error if all the proper procedures and coding practices are followed. The experiment samples are attached to this study for future reference or scrutiny.
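For reference, the classic Bollinger Bands that the paper sets out to improve are simply a rolling mean plus or minus k rolling standard deviations, and Onicescu's informational energy is the sum of squared probabilities of an empirical distribution. The sketch below shows both; the binning scheme and parameters are illustrative choices, not the paper's exact construction of its kinetic bands.

```python
from statistics import mean, stdev

def bollinger(prices, window=20, k=2.0):
    """Classic Bollinger Bands: rolling SMA plus/minus k rolling
    sample standard deviations, one value per full window."""
    mid, up, lo = [], [], []
    for i in range(window, len(prices) + 1):
        w = prices[i - window:i]
        m, s = mean(w), stdev(w)
        mid.append(m)
        up.append(m + k * s)
        lo.append(m - k * s)
    return mid, up, lo

def informational_energy(samples, bins=10):
    """Onicescu's informational energy of an empirical distribution:
    the sum of squared bin probabilities (1.0 for a point mass,
    1/bins for a perfectly uniform histogram)."""
    low, high = min(samples), max(samples)
    width = (high - low) / bins or 1.0
    counts = [0] * bins
    for x in samples:
        counts[min(int((x - low) / width), bins - 1)] += 1
    n = len(samples)
    return sum((c / n) ** 2 for c in counts)
```

Replacing the `stdev` term in `bollinger` with a quantity derived from `informational_energy` over a window of returns is, roughly, the kind of substitution the paper's kinetic bands make.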
This document discusses neural coding and how neurons encode information during cognitive tasks. It summarizes research using single-unit neural recordings in rats performing a directional control task. Three key findings are presented:
1) Neural firing rates in areas like PM and M1 encoded information about trial outcomes and the rat's learning progress over multiple sessions.
2) Precise spike timing provided information about the rat's cognitive state, with more synchronized activity observed early in learning.
3) Network modeling revealed changes in functional connectivity between neurons associated with synaptic plasticity, with more optimized patterns emerging as the rat became proficient at the task.
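Two of the quantities behind findings like these, trial firing rates and spike-time synchrony, can be computed very simply. The coincidence-window measure and the toy spike times below are illustrative stand-ins, not the study's actual analysis pipeline.

```python
def firing_rate(spike_times, t_start, t_end):
    """Mean firing rate (Hz) of one spike train over a trial window."""
    n = sum(t_start <= t < t_end for t in spike_times)
    return n / (t_end - t_start)

def coincidences(train_a, train_b, window=0.005):
    """Count spikes in train_a with a partner in train_b within
    +/- window seconds: a crude measure of synchronized activity."""
    return sum(any(abs(a - b) <= window for b in train_b) for a in train_a)

a = [0.010, 0.052, 0.101, 0.203]   # spike times (s) for two toy neurons
b = [0.012, 0.055, 0.300]
print(firing_rate(a, 0.0, 1.0), coincidences(a, b))
```

Tracking how the coincidence count changes across sessions is one simple way to see the "more synchronized activity early in learning" effect described above.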
L. Perivolaropoulos, Is There a Fundamental Cosmic Dipole? (SEENET-MTP)
This document discusses several puzzles in cosmological data that conflict with predictions from the standard Lambda Cold Dark Matter (ΛCDM) model of cosmology. These include large scale velocity flows, alignments of low cosmic microwave background multipoles, a dipole in measurements of the fine structure constant, and a possible dark energy dipole. The document proposes that a simple physical mechanism involving an off-center observer within a spherical dark energy inhomogeneity could provide a preferred axis and explain these observations. Specifically, it suggests that a topological quintessence model involving a Hubble-scale global monopole could generate the necessary dark energy inhomogeneity to account for the data.
The document discusses how computation can accelerate the generation of new knowledge by enabling large-scale collaborative research and extracting insights from vast amounts of data. It provides examples from astronomy, physics simulations, and biomedical research where computation has allowed more data and researchers to be incorporated, advancing various fields more quickly over time. Computation allows for data sharing, analysis, and hypothesis generation at scales not previously possible.
Profiles of Data Scientists in the United States (Thiago Mosqueiro)
The document discusses profiles of data scientists in the United States, including: (1) the research and development environment in companies is agile and based on continuous feedback; (2) data scientists deal with ambiguous problems and focus on data-driven approaches; (3) there are different profiles, such as data analysts, data scientists, research scientists, and applied scientists.
Non-parametric Change Point Detection for Spike Trains (Thiago Mosqueiro)
Two techniques of non-parametric change point detection are applied to two different neuroscience datasets. In the first dataset, we show how multivariate non-parametric change point detection can precisely estimate reaction times to input stimulation in the olfactory system, using the joint information of spike trains from several neurons. In the second example, we propose analyzing communication and sequence coding using the change point formalism as a time segmentation into homogeneous pieces of information, revealing cues that help elucidate the directionality of communication in electric fish. We also share our software implementation, Chapolins, on GitHub.
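A simplified stand-in for non-parametric change point detection (this is a generic rank-based sketch, not the Chapolins implementation): scan candidate split points and keep the one that maximizes a standardized rank-sum statistic, which makes no distributional assumptions about the data.

```python
import math

def change_point(xs, min_seg=3):
    """Locate a single change point by maximizing the standardized
    Wilcoxon rank-sum statistic of the left segment over all
    candidate splits; rank-based, so distribution-free."""
    n = len(xs)
    order = sorted(range(n), key=lambda i: xs[i])
    ranks = [0] * n
    for r, i in enumerate(order, 1):
        ranks[i] = r
    best, best_z = None, -1.0
    for k in range(min_seg, n - min_seg + 1):
        s = sum(ranks[:k])                         # left-segment rank sum
        mu = k * (n + 1) / 2                       # its mean under H0
        sd = math.sqrt(k * (n - k) * (n + 1) / 12) # and standard deviation
        z = abs(s - mu) / sd
        if z > best_z:
            best, best_z = k, z
    return best

data = [0.1, 0.2, 0.15, 0.1, 0.2, 2.1, 1.9, 2.2, 2.0, 2.05]
print(change_point(data))
```

On the toy series above the level shifts between indices 4 and 5, and the scan recovers split index 5. Segmenting a signal into homogeneous pieces, as in the electric-fish analysis, amounts to applying this kind of test recursively.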
This document discusses neural coding and how neurons encode information during cognitive tasks. It summarizes research using single-unit neural recordings in rats performing a directional control task. Three key findings are presented:
1) Neural firing rates in areas like PM and M1 encoded information about trial outcomes and the rat's learning progress over multiple sessions.
2) Precise spike timing provided information about the rat's cognitive state, with more synchronized activity observed early in learning.
3) Network modeling revealed changes in functional connectivity between neurons associated with synaptic plasticity, with more optimized patterns emerging as the rat became proficient at the task.
L. Perivolaropoulos, Is There a Fundamental Cosmic Dipole?SEENET-MTP
This document discusses several puzzles in cosmological data that conflict with predictions from the standard Lambda Cold Dark Matter (ΛCDM) model of cosmology. These include large scale velocity flows, alignments of low cosmic microwave background multipoles, a dipole in measurements of the fine structure constant, and a possible dark energy dipole. The document proposes that a simple physical mechanism involving an off-center observer within a spherical dark energy inhomogeneity could provide a preferred axis and explain these observations. Specifically, it suggests that a topological quintessence model involving a Hubble-scale global monopole could generate the necessary dark energy inhomogeneity to account for the data.
The document discusses how computation can accelerate the generation of new knowledge by enabling large-scale collaborative research and extracting insights from vast amounts of data. It provides examples from astronomy, physics simulations, and biomedical research where computation has allowed more data and researchers to be incorporated, advancing various fields more quickly over time. Computation allows for data sharing, analysis, and hypothesis generation at scales not previously possible.
Os Perfis dos Cientistas de Dados nos Estados UnidosThiago Mosqueiro
O documento discute perfis de cientistas de dados nos Estados Unidos, incluindo: (1) o ambiente de pesquisa e desenvolvimento em empresas é ágil e baseado em feedback contínuo, (2) cientistas de dados lidam com problemas ambíguos e focam em abordagens baseadas em dados, (3) existem diferentes perfis como analistas de dados, cientistas de dados, cientistas de pesquisa e cientistas aplicados.
Non-parametric Change Point Detection for Spike TrainsThiago Mosqueiro
Two techniques of non-parametric change point detection are applied to two different neuroscience datasets. In the first dataset, we show how the multivariate non-parametric change point detection can precisely estimate reaction times to input stimulation in the olfactory system using joint information of spike trains from several neurons. In the second example, we propose to analyze communication and sequence coding using change point formalism as a time segmentation of homogeneous pieces of information, revealing cues to elucidate directionality of the communication in electric fish. We are also sharing our software implementation Chapolins at GitHub.
O documento discute o uso do LATEX e do BibTeX para a formatação de teses e dissertações no IFSC, apresentando os principais comandos e estrutura dos arquivos. É proposto o acordo em um estilo padrão para as teses a fim de tornar o processo mais ágil. É explicado também o funcionamento do BibTeX para incorporação de referências bibliográficas seguindo o estilo escolhido.
A brief review of basics in Statistics and Classical Inference. This talk was given to a very specific public, interested in seeing how Statistics can be employed step-by-step. Especially, Maximum Likelihood estimators are discussed and applied to three simple data sets as a way to fit your probabilistic.
Flutuações e Estatísticas: Estudo sobre o Decaimento RadioativoThiago Mosqueiro
Workshop apresentado dia 4 de Julho. Recebemos alguns elogios e algumas críticas: realmente aprendemos muito. Para mim, foi a experiência mais instrutiva e educativa que tive em meu semestre.
1. Information dynamics in the Kinouchi-Copelli model
T. S. Mosqueiro and Leonardo P. Maia
Instituto de Física de São Carlos
Universidade de São Paulo
thiago.mosqueiro@gmail.com
lpmaia@ifsc.usp.br
May 16, 2012
Mosqueiro and Maia (IFSC - USP) Information dynamics in KC model 12th ECC – Ann Arbor, 2012 1 / 13
2. Neuronal avalanches
* Experiments revealed power-law distributions for both duration and size of bursts of activity.
Beggs and Plenz. J. Neuroscience, v. 23 p. 11167 (2003)
Shew et al. J. Neuroscience, v. 31 p. 55 (2011)
Key questions:
Criticality in neurodynamics?
Critical optimization of information processing?
(“Edge of chaos”)
Psychophysics?
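The bursts above are conventionally read off a binned activity record: an avalanche is a maximal run of nonempty time bins bounded by empty bins (the Beggs-Plenz convention). A minimal sketch of that bookkeeping; the toy counts are invented for illustration:

```python
import numpy as np

def detect_avalanches(counts):
    """Split a binned event-count series into avalanches: maximal runs of
    nonempty bins separated by empty bins. Returns (sizes, durations),
    where size = total events in the run and duration = number of bins."""
    sizes, durations = [], []
    size = dur = 0
    for c in counts:
        if c > 0:
            size += c
            dur += 1
        elif dur > 0:
            sizes.append(size)
            durations.append(dur)
            size = dur = 0
    if dur > 0:  # avalanche still running at the end of the record
        sizes.append(size)
        durations.append(dur)
    return np.array(sizes), np.array(durations)

# toy record: two avalanches separated by empty bins
counts = [0, 2, 3, 1, 0, 0, 1, 1, 0]
s, d = detect_avalanches(counts)
print(s, d)  # sizes [6 2], durations [3 2]
```

The power-law claim in the references is then a statement about the empirical distributions of these sizes and durations.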
3. Criticality in neural systems
4. Our aim here
• Illustrate connections between neural dynamics and psychophysics
• Argue that information efficiency outperforms information capacity
• Discuss evidence of criticality w/o power-laws
• Preliminary results: quantify information flow leading to psychophysics
5. Kinouchi–Copelli model
Optimal dynamic range → critical optimization
• N neurons as nodes of a weighted undirected random graph
• Weight matrix A
• Average connectivity: K
• External stimulus r: rate of a Poisson process
Kinouchi and Copelli, Nature Physics, v. 2 p. 348-352 (2006)
* X_j(t) = 0: quiescent state
* X_j(t) = 1: excited state
* 2 ≤ X_j(t) ≤ m − 1: refractory states
• Mean activity = time average of the excited fraction of the network
• s_j = Σ_k A_kj: local branching ratio
• Average branching ratio: σ := ⟨s_j⟩
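For concreteness, the update rule above can be sketched in a few lines. Everything below that is not on the slide is an illustrative assumption: the wiring (K random out-neighbors per node), the weights uniform on [0, 2σ/K] so that ⟨s_j⟩ = σ (the choice made in the original Kinouchi-Copelli paper), and the specific parameter values:

```python
import numpy as np

# Illustrative sketch of the Kinouchi-Copelli automaton: 0 = quiescent,
# 1 = excited, 2..m-1 = refractory. Wiring and parameters are assumptions
# for this sketch, not taken from the talk's simulations.
rng = np.random.default_rng(1)

N, K, m, sigma = 200, 10, 5, 1.0  # network size, mean degree, states, branching
r = 0.01                          # external Poisson rate (events per time step)
p_ext = 1.0 - np.exp(-r)          # per-step probability of external excitation

A = {}                              # A[(j, k)]: probability that j excites k
neighbors = [[] for _ in range(N)]  # neighbors[k]: nodes projecting to k
for j in range(N):
    for k in rng.choice(N, size=K, replace=False):
        if k != j:
            A[(j, k)] = rng.uniform(0.0, 2.0 * sigma / K)
            neighbors[k].append(j)

def step(x):
    """One parallel update: non-quiescent states advance 1 -> 2 -> ... ->
    m-1 -> 0; quiescent nodes fire via external drive or excited neighbors."""
    new = np.where(x == 0, 0, (x + 1) % m)
    for k in np.flatnonzero(x == 0):
        if rng.random() < p_ext:
            new[k] = 1
            continue
        for j in neighbors[k]:
            if x[j] == 1 and rng.random() < A[(j, k)]:
                new[k] = 1
                break
    return new

x = np.zeros(N, dtype=int)
activity = []
for _ in range(500):
    x = step(x)
    activity.append(np.mean(x == 1))
F = float(np.mean(activity))  # mean activity: time-averaged excited fraction
print(f"mean activity F = {F:.3f}")
```

Sweeping r over several decades and recording F at each value yields the response curve whose width defines the dynamic range.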
7. Optimal channel efficiency
Preprint: arXiv:1204.0751v1 [physics.bio-ph]
[Figure panels: Erdős–Rényi topology; Barabási–Albert topology]
Dynamic range is optimized concomitantly with information efficiency encoded in avalanche lifetimes.
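The dynamic range here is, by the usual Kinouchi-Copelli convention, Δ = 10·log10(r_0.9/r_0.1), where r_x is the stimulus at which the response F crosses x of its span between baseline and saturation. A sketch of that measurement, using an invented saturating curve in place of the simulated F(r):

```python
import numpy as np

def dynamic_range(r, F, lo=0.1, hi=0.9):
    """Dynamic range Delta = 10*log10(r_hi / r_lo), where r_x is the
    stimulus at which the response crosses F0 + x*(Fmax - F0).
    Expects F monotonically increasing with r."""
    F0, Fmax = F[0], F[-1]
    r_lo = np.interp(F0 + lo * (Fmax - F0), F, r)  # invert the response curve
    r_hi = np.interp(F0 + hi * (Fmax - F0), F, r)
    return 10.0 * np.log10(r_hi / r_lo)

# toy saturating response F(r) = r/(r + 1) standing in for the simulated curve
r = np.logspace(-4, 2, 400)
F = r / (r + 1.0)
delta = dynamic_range(r, F)
print(f"Delta = {delta:.1f} dB")  # close to 10*log10(81) ~ 19 dB for this curve
```

Critical networks (σ = 1) are the ones that maximize this quantity in the model.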
8. Order parameter: spontaneous activity
9. No power-laws?
• Dehghani et al.: arXiv:1203.0738v2
• Friedman et al.: to appear in PRL
10. Local information dynamics
Hamming distance
• Let A = (X_1(0), X_2(0), X_3(0), . . .)
• Define B by flipping a randomly chosen node
• δ = (1/N) Σ_{j=1}^{N} |A_j − B_j|
Entropy rate
• Define H_k(X_j) = −log P{X_j(k), X_j(k − 1), . . . , X_j(0)}
• H(X) = lim_{k→∞} H_k(X)/k
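Both quantities on this slide can be computed directly from data. The block-frequency (plug-in) estimator below is an illustrative choice for H_k, not necessarily the estimator behind the talk's results; the Hamming distance follows the slide's definition literally, over integer states:

```python
import numpy as np

def hamming(a, b):
    """delta = (1/N) * sum_j |A_j - B_j| between two configurations."""
    return float(np.mean(np.abs(np.asarray(a) - np.asarray(b))))

def entropy_rate_plugin(x, k):
    """Plug-in estimate of the entropy rate (nats): H_k is the Shannon
    entropy of the empirical distribution of length-(k+1) words, matching
    H_k(X) = -log P{X(k), ..., X(0)}, and H ~ H_k / k for large k."""
    counts = {}
    for i in range(len(x) - k):
        w = tuple(x[i:i + k + 1])
        counts[w] = counts.get(w, 0) + 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    H_k = -np.sum(p * np.log(p))
    return H_k / k

d = hamming([0, 1, 2, 0], [1, 1, 2, 0])  # one node flipped out of four
print(d)  # 0.25

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=20000)  # i.i.d. fair coin, true rate = log 2 nats
est = entropy_rate_plugin(x, k=5)
print(f"{est:.3f}")                 # ~ (6/5)*log 2 ~ 0.83 at this finite k
```

In the damage-spreading picture, tracking δ over time after the flip probes how perturbations grow or die out in the network dynamics.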
11. Preliminary reports
• Performed simulations: N = 10^4 and K = 10
• Erdős–Rényi and Barabási–Albert topologies
• Sampling ∼ 5 × 10^3 events with k ∼ 10
• Criticality vs local information dynamics?
[Figure panels: Erdős–Rényi topology; Erdős–Rényi topology]
12. Dynamic range and the entropy rate
Preliminary reports
The entropy rate appears to reflect the dynamic-range optimization.
13. Hamming distance and a transition
Preliminary reports
[Figure panels: Erdős–Rényi topology; Barabási–Albert topology]
14. The end?
Lots of work to characterize what optimizes information efficiency!
• This work was financially supported by CAPES and FAPESP.
• We thank John Beggs.