Analysis of community behavior and of its interactions, both within the community and with external entities (other communities, civil and industrial engineered systems, organizations, governments, etc.), is a critical topic in a wide variety of domains, from sociology and psychology to marketing science, security analytics, defense operations, political science, and other fields. Viewing a community as an engineered system allows the researcher to separate metrics characterizing the behavior of the community as a whole from metrics describing activities within it. One of the fundamental parameters of a community is its resilience. There are several accepted definitions of community resilience; however, translating them into practically applicable mathematical terms is a non-trivial task. In this paper, we mathematically derive an applicable metric of community resilience and demonstrate how the metric can be estimated iteratively in a Bayesian process. Due to the specifics of community dynamics, Bayesian correction of metric estimates from real community data is a slow process: intervals between community-affecting events in the real world are usually long (months to years), and the available measurements of community metrics that can be translated into state variables are often so aggregated that their usefulness is limited. For these reasons, we use a simulation of community population changes in response to changes in the sentiment of social and public media to demonstrate practical calculation of the proposed metric.
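To make the estimation step concrete, the sketch below shows one way such an iterative Bayesian update could look in Python. It is a hypothetical illustration, not the paper's actual formulation: the recovery criterion, the Beta prior, and the simulate_event stand-in for the population simulation are all assumptions made for this example.

# Hypothetical sketch (not the paper's formulation): treat resilience as the
# probability that the community recovers to >= 95% of its pre-event population
# within a fixed window after a disruptive event, and refine a Beta prior as
# simulated events are observed one at a time (conjugate Bayesian updating).
import random

random.seed(42)

def simulate_event(recovery_prob=0.7):
    """Stand-in for the population simulation: returns True if the simulated
    community recovers after a sentiment-driven disruption."""
    return random.random() < recovery_prob

# Beta(1, 1) = uniform prior over the resilience metric.
alpha, beta = 1.0, 1.0
for event in range(20):                      # events are rare, so samples accrue slowly
    recovered = simulate_event()
    alpha += 1 if recovered else 0           # conjugate update: successes
    beta  += 0 if recovered else 1           # failures
    mean = alpha / (alpha + beta)            # posterior mean of the resilience metric
    print(f"event {event+1:2d}: posterior mean resilience = {mean:.3f}")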
Massively Parallel Simulations of Spread of Infectious Diseases over Realistic Social Networks (Subhajit Sahu)
Highlighted notes while preparing for a project on Computational Epidemics:
Massively Parallel Simulations of Spread of Infectious Diseases over Realistic Social Networks
Abhinav Bhatele, Jae-Seung Yeom, Nikhil Jain, Chris J. Kuhlman, Yarden Livnat, Keith R. Bisset, Laxmikant V. Kale, Madhav V. Marathe
Controlling the spread of infectious diseases in large populations is an important societal challenge. Mathematically, the problem is best captured as a certain class of reaction-diffusion processes (referred to as contagion processes) over appropriate synthesized interaction networks. Agent-based models have been successfully used in the recent past to study such contagion processes. We describe EpiSimdemics, a highly scalable, parallel code written in Charm++ that uses agent-based modeling to simulate disease spread over large, realistic, co-evolving interaction networks. We present a new parallel implementation of EpiSimdemics that achieves unprecedented strong and weak scaling on different architectures: Blue Waters, Cori, and Mira. EpiSimdemics achieves five times greater speedup than the second-fastest parallel code in this field. This unprecedented scaling is an important step in supporting the long-term vision of real-time epidemic science. Finally, we demonstrate the capabilities of EpiSimdemics by simulating the spread of influenza over a realistic synthetic social contact network spanning the continental United States (∼280 million nodes and 5.8 billion social contacts).
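For intuition about the contagion processes EpiSimdemics simulates at scale, the toy sketch below runs an agent-based SIR process over a small random contact network using only the Python standard library. The network size, transmission probability, and recovery probability are illustrative assumptions and bear no relation to the paper's calibrated influenza model.

# Minimal agent-based SIR sketch on a small random contact network, illustrating the
# kind of contagion process EpiSimdemics simulates at vastly larger scale.
import random

random.seed(1)
N, AVG_DEG = 2000, 8
P_TRANSMIT, P_RECOVER = 0.05, 0.25

# Build a random contact network (Erdos-Renyi-style edge list).
edges = [[] for _ in range(N)]
for _ in range(N * AVG_DEG // 2):
    a, b = random.randrange(N), random.randrange(N)
    if a != b:
        edges[a].append(b)
        edges[b].append(a)

state = ["S"] * N                 # S(usceptible), I(nfected), R(ecovered)
for seed_node in random.sample(range(N), 5):
    state[seed_node] = "I"

for day in range(60):
    new_state = state[:]
    for person in range(N):
        if state[person] == "I":
            # Try to infect each susceptible contact, then possibly recover.
            for contact in edges[person]:
                if state[contact] == "S" and random.random() < P_TRANSMIT:
                    new_state[contact] = "I"
            if random.random() < P_RECOVER:
                new_state[person] = "R"
    state = new_state
    print(day, state.count("S"), state.count("I"), state.count("R"))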
Presentation by U. Devrim Demirel, CBO's Fiscal Policy Studies Unit Chief, and James Otterson at the 28th International Conference of The Society for Computational Economics.
The COVID-19 pandemic is an urgent call for rethinking our collective social-ecological and socio-technical systems. In this free webinar, I speak about how the framework of Mindfulness Engineering can provide answers to some of the current challenges that the coronavirus has imposed on global systems.
This paper is a methodological exercise presenting the results obtained from estimating the growth-convergence equation using different methodologies.
A dynamic balanced panel is estimated using OLS, Within-Group, Anderson-Hsiao, First Difference, GMM with endogenous instruments, and GMM with predetermined instruments. An unbalanced panel is also estimated for OLS, WG, and FD.
Results are discussed in light of Monte Carlo studies.
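To make the comparison of estimators concrete, the sketch below simulates a simple dynamic panel and contrasts pooled OLS, first-difference OLS, and the Anderson-Hsiao instrumental-variable estimator. The data are synthetic and the model is deliberately stripped down (no exogenous regressors); it illustrates the estimators' behavior, not the paper's actual growth-convergence equation.

# Hedged numerical sketch: a dynamic panel y_it = rho*y_{i,t-1} + alpha_i + eps_it
# estimated three ways, showing why pooled OLS is biased and how the Anderson-Hsiao
# instrument on first differences removes that bias.
import numpy as np

rng = np.random.default_rng(0)
n_units, n_periods, rho = 500, 8, 0.5

alpha = rng.normal(0, 1, n_units)                     # unit fixed effects
y = np.zeros((n_units, n_periods))
y[:, 0] = alpha + rng.normal(0, 1, n_units)
for t in range(1, n_periods):
    y[:, t] = rho * y[:, t - 1] + alpha + rng.normal(0, 1, n_units)

# Pooled OLS on levels (biased upward: y_{t-1} is correlated with alpha_i).
y_lag, y_cur = y[:, :-1].ravel(), y[:, 1:].ravel()
rho_ols = np.sum(y_lag * y_cur) / np.sum(y_lag ** 2)

# First-difference OLS (biased downward: dy_{t-1} is correlated with the differenced error).
dy = np.diff(y, axis=1)
dy_cur, dy_lag = dy[:, 1:].ravel(), dy[:, :-1].ravel()
rho_fd = np.sum(dy_lag * dy_cur) / np.sum(dy_lag ** 2)

# Anderson-Hsiao: instrument dy_{t-1} with the level y_{t-2}, which is uncorrelated
# with the differenced error term.
z = y[:, :-2].ravel()
rho_ah = np.sum(z * dy_cur) / np.sum(z * dy_lag)

print(f"true rho={rho}, OLS={rho_ols:.3f}, FD={rho_fd:.3f}, Anderson-Hsiao={rho_ah:.3f}")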
The role of events in simulation modeling (Weibull AS)
The need to assess the impact of events with binary outcomes (loan defaults, the occurrence of recessions, passage of special legislation, etc.), or of events that can be treated as binary (paradigm shifts in consumer habits, changes in competitor behavior, new innovations), arises often in economics and other areas of decision making.
By using analogies from intervention analysis, a number of interesting and important issues can be analyzed:
If two events affect one response variable, will the combined effect be less or greater than the sum of the two?
Will one event affecting more than one response variable increase the effect dramatically?
Is there a risk of counting the same cost twice?
If an event occurs at the end of a project, will the project be prolonged, and what will the costs be?
Questions like these can never be analyzed with a 'second layer lump sum' approach. Even more important is the possibility of incorporating the responses to exogenous events inside the simulation model, so that the responses occur at the correct point on the timeline and yield a correct net present value for costs, revenues, and company or project value.
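The sketch below illustrates the point with a toy Monte Carlo model: binary exogenous events are attached to specific periods of a project cash-flow timeline, so their responses occur at the correct point in time and flow into the net present value. All probabilities and cash flows are hypothetical.

# Illustrative sketch (all numbers hypothetical): Monte Carlo simulation of a project
# cash-flow timeline in which binary exogenous events (e.g., a recession, a special
# piece of legislation) are placed at specific periods and feed into the NPV.
import random

random.seed(7)
BASE_CASH_FLOW = [-1000, 300, 300, 300, 300, 300]     # per year, year 0 is the investment
DISCOUNT_RATE = 0.08
N_RUNS = 10_000

def npv(flows, rate):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

npvs = []
for _ in range(N_RUNS):
    flows = BASE_CASH_FLOW[:]
    # Event 1: a recession in year 2 (probability 0.3) cuts revenues from year 2 on.
    if random.random() < 0.30:
        for t in range(2, len(flows)):
            flows[t] *= 0.7
    # Event 2: new legislation in year 4 (probability 0.2) adds a compliance cost
    # and prolongs the project by one extra, cheaper year.
    if random.random() < 0.20:
        flows[4] -= 100
        flows.append(150)
    npvs.append(npv(flows, DISCOUNT_RATE))

npvs.sort()
print("mean NPV:", sum(npvs) / N_RUNS)
print("5th percentile NPV:", npvs[int(0.05 * N_RUNS)])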
Assetdyne Proxies and Measures of Sustainability (Datonix.it)
Sustainability: new quantitative approaches, new indices, and innovative ways to preserve value in a turbulent economy. New tools for complexity- and entropy-based proxies and measurements of sustainability.
We have proposed a new form of growth rate for population ecology. Generally, the growth rate depends on the size of the population at a particular epoch; we introduce an alternative, time-dependent form of growth rate. This form satisfies the essential conditions for representing population growth and can serve as an alternative for growth models used to analyze population dynamics. We use the generalized Richards model as a guideline for comparing our results. Further, we apply our model to epidemics: to check its efficacy, we verify it against the 2003 SARS data, for which the model estimates the final epidemic size with good accuracy. We then describe the present COVID-19 pandemic, performing our analysis with data for Italy and Germany, and predict the number of COVID-19 cases and the turning point for the USA and India.
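A hedged sketch of the curve-fitting step: the code below fits one common closed form of the generalized Richards (generalized logistic) growth curve to synthetic cumulative counts with scipy. The data and parameter values are simulated for illustration, not the SARS or COVID-19 series analyzed in the work.

# Fit C(t) = K / (1 + exp(-r*(t - tau)))**(1/nu) to synthetic cumulative case counts.
import numpy as np
from scipy.optimize import curve_fit

def richards(t, K, r, tau, nu):
    return K / (1.0 + np.exp(-r * (t - tau))) ** (1.0 / nu)

rng = np.random.default_rng(3)
t = np.arange(0, 120)
true = richards(t, K=50_000, r=0.12, tau=60, nu=1.5)
observed = true * rng.normal(1.0, 0.03, t.size)        # multiplicative observation noise

params, _ = curve_fit(richards, t, observed, p0=[40_000, 0.1, 50, 1.0], maxfev=10_000)
K_hat, r_hat, tau_hat, nu_hat = params
t_turn = tau_hat - np.log(nu_hat) / r_hat              # inflection point of this parameterization
print(f"estimated final epidemic size K = {K_hat:.0f}, turning point near t = {t_turn:.1f}")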
Disasters and Resilience: Issues and Perspectives (OSU_Superfund)
PREPARED BY: Nina Lam, Professor LSU Environmental Sciences January 29, 2013
More information on symposium: http://superfund.oregonstate.edu/LSUSymposium1.13
Exponential Growth: Case Studies for Sustainability Education (Toni Menninger)
Understanding exponential growth is of critical importance in sustainability, resource conservation, and economics. This work contains a collection of practice problems and realistic case studies developed for the teaching of sustainability science and conservation, with an emphasis on learning and applying the concepts of exponential growth. The exercises are designed to foster quantitative competence (numeracy) as well as critical thinking and systems thinking. Students learn to work with tools such as spreadsheet software and online databases and practice the application of basic but powerful quantitative analysis techniques. The case studies are based on recent, high-quality data and explore questions of high relevance for the study and application of sustainability science.
This work is related to the Growth in a finite world presentation (http://www.slideshare.net/amenning/growth-in-a-finite-world-sustainability-and-the-exponential-function).
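A tiny worked example of the kind of exercise described above: constant-percentage growth compounds exponentially, and the doubling time follows directly from the growth rate.

# Doubling time and projected size under constant 3%/year growth (illustrative values).
import math

population = 1_000_000
growth_rate = 0.03                                     # 3% per year
doubling_time = math.log(2) / math.log(1 + growth_rate)
print(f"doubling time at 3%/yr: {doubling_time:.1f} years")   # ~23.4 years

for year in (10, 50, 100):
    print(year, round(population * (1 + growth_rate) ** year))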
The full article is available here: https://www.researchgate.net/publication/320408185_Quantitative_Framework_to_Assess_Resilience_and_Risk_at_the_Country_Level
or here:
https://ascelibrary.org/doi/10.1061/AJRUA6.0000940
Abstract:
This research presents an analytical approach to assess the resilience of communities and states based on the Hyogo Framework for Action (HFA). The United Nations (UN), through their advancements in Disaster Risk Reduction, have released multiple international blueprints to help build the resilience of nations and communities, among which are the Hyogo Framework for Action and the Sendai Framework. The latter is still under development, as its risk bases and resilience indicators are yet to be defined; for this reason, the work presented here is built upon the more complete HFA framework. A number of weighted indicators taken from HFA are used to compute resilience. Those indicators, however, do not affect the resilience index equally, which necessitates weighting the indicators on the basis of their individual contribution towards resilience. To achieve this, we have used Dependence Tree Analysis (DTA), a method that identifies the dependencies between the HFA indicators and the resilience index and evaluates the weight factors of the different indicators in an unbiased way.
The research also proposes an analytic formulation for a new index, the Bounce Back Index (BBI), which combines a community's Exposure, Hazard, and Resilience. To illustrate the methodology in full detail, a case study of 37 countries is presented, in which the Resilience and Bounce Back indexes of each country are evaluated.
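As a purely illustrative aggregation sketch, the snippet below shows how a weighted indicator set could be rolled up into a resilience index and then combined with exposure and hazard scores. The indicator names, weights, scores, and the bounce-back combination are hypothetical stand-ins; the paper's DTA-derived weights and actual BBI formula are not reproduced here.

# Purely illustrative: hypothetical indicator scores (1-5 scale) and weights.
indicators = {
    "risk_governance":        (4.0, 0.30),
    "early_warning_systems":  (3.5, 0.25),
    "risk_education":         (3.0, 0.20),
    "preparedness_funding":   (2.5, 0.25),
}

# Weighted resilience index, normalized to [0, 1].
resilience = sum(score * w for score, w in indicators.values()) / 5.0

exposure, hazard = 0.6, 0.4          # hypothetical country-level scores in [0, 1]
bounce_back = resilience / (exposure * hazard + 1e-9)   # one possible combination, not the paper's
print(f"resilience index = {resilience:.2f}, illustrative bounce-back score = {bounce_back:.2f}")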
Diminishing Returns: When Should Real-world Surveys Stop Sampling? (Inspirient)
This talk was given on 18 July 2023 at the European Survey Research Association conference (ESRA'23) in Milan, Italy, by Dr. Georg Wittenburg (Inspirient) and Dr. Josef Hartmann (Kantar Public). We argue that for real-world surveys, for which the scope of deliverables is known before the survey starts, it is possible to reduce the required sample size by monitoring association and significance metrics while the data is being collected.
The full abstract of this talk is as follows (also available at https://www.europeansurveyresearch.org/conf2023/prog.php?sess=113#603):
It is intuitively clear that more can be learned from the initial survey interview of a sample than from the 10,000th interview: Assuming random sampling, the information contained in each new sample element only contributes in smaller and smaller increments to the information of all previously collected interviews. While it is a fact that – given a sufficiently high sample size – a significant result may eventually be achieved even for trivial relations, this point of view only partly describes the issue at hand, simply because in real-world surveys each additional sample element comes at a cost. The cost per sample element may be constant in simpler setups, but in practice this cost increases as hard-to-reach or hard-to-convince subpopulations may require additional effort. For real-world surveys, conducted under economic constraints, the question of when to stop sampling is thus very much worth revisiting.
The question of optimal sample size is commonly modelled as an a priori problem: the required sample size n is estimated for a given set of parameters, including effect size, significance level and statistical power (power analysis). Complementing this perspective, we propose to look at determining sample size as an adaptive problem, i.e., one that tracks effect size and significance metrics as sampled elements come in. We propose to observe the rate of convergence of these metrics while the survey is still in progress, and thus have the opportunity to stop as soon as saturation sets in. We have validated this approach on a number of real-world survey datasets and found that in some cases comparable results regarding effect size and overall significance levels could be reached with less than half of the number of cases actually taken. The results imply fewer respondents and thus less respondent burden.
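A minimal sketch of the adaptive idea (our own illustration, not the authors' implementation): compute the effect size and its significance as interviews accumulate, and stop once the estimate has stabilized over a trailing window. The data, window size, and tolerance below are assumptions made for the example.

# Sequentially monitor a Pearson correlation and its p-value; stop at saturation.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(11)
MAX_N, CHECK_EVERY, WINDOW, TOL = 5000, 50, 5, 0.005

x_all = rng.normal(size=MAX_N)
y_all = 0.3 * x_all + rng.normal(size=MAX_N)           # true correlation ~0.29

history = []
for n in range(100, MAX_N + 1, CHECK_EVERY):
    r, p = pearsonr(x_all[:n], y_all[:n])              # effect size + significance so far
    history.append(r)
    stable = len(history) > WINDOW and max(history[-WINDOW:]) - min(history[-WINDOW:]) < TOL
    if stable and p < 0.01:
        print(f"saturation after {n} interviews: r = {r:.3f}, p = {p:.2g}")
        break
else:
    print("no saturation before the full sample was used")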
On the Dynamics of Machine Learning Algorithms and Behavioral Game TheoryRikiya Takahashi
Presentation Material used in guest lecturing at University of Tsukuba on September 17, 2016.
The target audience is part-time PhD students working on a machine learning, data mining, or agent-based simulation project.
Multiple Linear Regression Applications in Real Estate Pricing (inventionjournals)
In this paper, we attempt to predict the prices of individual homes sold in Northwest Indiana, based on homes sold in 2014. The data are collected from realtor.com. The purpose of this paper is to predict the price of individual homes sold using a multiple regression model, utilizing SAS forecasting models and software. We also determine the factors influencing housing prices and the extent to which they affect the price. Independent variables include square footage, number of bathrooms, whether there is a finished basement, whether there is a brick front, and the type of home: Colonial, Contemporary, or Tudor. How much does each type of home (Colonial, Contemporary, Tudor) add to the price of the real estate?
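A hedged sketch of the modeling approach with made-up numbers (the paper itself uses SAS and the realtor.com data, which are not reproduced here): price regressed on square footage, bathrooms, finished basement, brick front, and home-type dummies via ordinary least squares.

# Columns: sqft, baths, finished_basement, brick_front, colonial, contemporary
# (Tudor is the omitted baseline category for the home-type dummies).
import numpy as np

X = np.array([
    [1800, 2, 1, 0, 1, 0],
    [2400, 3, 1, 1, 0, 1],
    [1500, 2, 0, 0, 0, 0],
    [3000, 4, 1, 1, 1, 0],
    [2100, 2, 0, 1, 0, 1],
    [2700, 3, 1, 0, 0, 0],
    [1900, 2, 0, 0, 1, 0],
    [2200, 3, 1, 0, 0, 1],
    [2600, 3, 0, 1, 1, 0],
    [1700, 2, 1, 1, 0, 0],
], dtype=float)
price = np.array([210_000, 295_000, 160_000, 380_000, 240_000,
                  310_000, 205_000, 265_000, 300_000, 195_000], dtype=float)

X_design = np.column_stack([np.ones(len(X)), X])        # add intercept
coef, *_ = np.linalg.lstsq(X_design, price, rcond=None)
names = ["intercept", "sqft", "baths", "basement", "brick", "colonial", "contemporary"]
for name, b in zip(names, coef):
    print(f"{name:>13s}: {b:,.1f}")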
Big Data and Big Cities: The Promises and Limitations of Improved Measures of Urban Life (tangyechloe)
BIG DATA AND BIG CITIES: THE PROMISES AND LIMITATIONS OF IMPROVED MEASURES OF URBAN LIFE
EDWARD L. GLAESER, SCOTT DUKE KOMINERS, MICHAEL LUCA and NIKHIL NAIK
New, “big data” sources allow measurement of city characteristics and outcome variables at higher collection frequencies and more granular geographic scales than ever before. However, big data will not solve large urban social science questions on its own. Big urban data has the most value for the study of cities when it allows measurement of the previously opaque, or when it can be coupled with exogenous shocks to people or place. We describe a number of new urban data sources and illustrate how they can be used to improve the study and function of cities. We first show how Google Street View images can be used to predict income in New York City, suggesting that similar imagery data can be used to map wealth and poverty in previously unmeasured areas of the developing world. We then discuss how survey techniques can be improved to better measure willingness to pay for urban amenities. Finally, we explain how Internet data is being used to improve the quality of city services. (JEL R1, C8, C18)
Informs2020: Using machine learning to identify the factors of people's mobility (Alex Gilgur)
Mobility is an important metric in the modeling of community population dynamics and community resilience. It is directly associated with the inorganic changes in a community during and after a disruption (e.g., city gentrification, refugee migration from a war zone, flash mobs in an online community, etc.). Mobility is driven by socioeconomic, demographic, geographical, psychological, and legal parameters. Not all of these parameters are mutually independent (orthogonal). For proper modeling, it is important to avoid collinearity, as otherwise the model will not generalize well. We discuss how machine learning can be used to avoid it by identifying the mutually orthogonal metrics (factors).
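The sketch below illustrates, on synthetic data, one way the collinearity problem described above can be detected and handled: variance inflation factors flag mutually dependent drivers, and PCA extracts mutually orthogonal factors. The variables and values are assumptions for illustration, not the factors identified in the talk.

# Synthetic mobility drivers: "rent" is deliberately collinear with "income".
import numpy as np

rng = np.random.default_rng(5)
n = 1000
income   = rng.normal(size=n)
rent     = 0.8 * income + 0.2 * rng.normal(size=n)     # strongly collinear with income
distance = rng.normal(size=n)                           # roughly independent driver
X = np.column_stack([income, rent, distance])

# Variance inflation factor for each column: regress it on the remaining columns.
def vif(X, j):
    y = X[:, j]
    Z = np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    r2 = 1 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

print("VIFs:", [round(vif(X, j), 1) for j in range(X.shape[1])])

# PCA: the principal components are mutually orthogonal factors by construction.
Xc = X - X.mean(axis=0)
_, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained by orthogonal factors:", np.round(explained, 2))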
When forecasting workload for capacity planning, there is always a "magic number": the probability of not under-forecasting. Then comes the problem of forecasting at such a probability. However, the upper percentiles are where non-stationarity has its highest impact on the workload. In this presentation, we show an elegant way to overcome this and other issues without losing mathematical rigor.
This presentation talks about an elegant way to combine the strengths of regression and TSA forecasting to deliver better answers to capacity planning questions.
Statistical Process Control (SPC) is a well-described framework used to identify weak points in any process and to predict the probability of failure in it. The distribution parameters of process metrics have been translated into process capability, which evolved in the 1990s into the Six Sigma methodology in a number of incarnations. However, all techniques derived for SPC have two important weaknesses: they assume that the process metric is in a steady state, and they assume that it is normally distributed or can be converted to a normal distribution. The concepts and ideas outlined in this paper make it possible to overcome these two shortcomings. Our methodology is a generalization of traditional SPC to nonstationary and non-Gaussian metrics. The techniques outlined in this paper have been developed and validated for the IT industry, but they can easily be translated into other domains.
When sizing any network capacity, several factors, such as Traffic, Quality of Service (QoS), and Total Cost of Ownership (TCO) are usually taken into account. Generally, it boils down to a joint minimization of cost and maximization of traffic subject to the constraints of protocol and QoS requirements. The stochastic nature of network traffic and the link saturation queueing issues add uncertainty to the already complex optimization problem. In this paper, we examine the sources of traffic demand variability and dive into Monte-Carlo methodology as an efficient way for solving these problems.
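As a hedged illustration of the Monte Carlo approach described above, the sketch below samples stochastic traffic demand, applies a simple M/M/1-style delay formula, and estimates the probability that a candidate link capacity violates a latency target. All parameters (demand distribution, capacity, QoS threshold) are illustrative assumptions, not values from the talk.

# Monte Carlo sizing sketch: estimate P(QoS violated) for a candidate link capacity.
import random, math

random.seed(2)
N_TRIALS = 100_000
CAPACITY_MBPS = 1000.0
QOS_LATENCY_MS = 20.0
MEAN_PKT_BITS = 8000.0                                  # ~1 kB packets

violations = 0
for _ in range(N_TRIALS):
    # Lognormal demand with median ~600 Mbps and a heavy upper tail.
    demand = random.lognormvariate(math.log(600.0), 0.35)
    rho = demand / CAPACITY_MBPS                        # utilization
    if rho >= 1.0:
        violations += 1                                 # saturated link: QoS surely violated
        continue
    service_ms = MEAN_PKT_BITS / (CAPACITY_MBPS * 1e6) * 1e3
    delay_ms = service_ms / (1.0 - rho)                 # M/M/1 mean sojourn time
    if delay_ms > QOS_LATENCY_MS:
        violations += 1

print(f"P(QoS violated) at {CAPACITY_MBPS:.0f} Mbps: {violations / N_TRIALS:.4f}")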
THE IMPORTANCE OF MARTIAN ATMOSPHERE SAMPLE RETURN (Sérgio Sacani)
The return of a sample of near-surface atmosphere from Mars would facilitate answers to several first-order science questions surrounding the formation and evolution of the planet. One of the important aspects of terrestrial planet formation in general is the role that primary atmospheres played in influencing the chemistry and structure of the planets and their antecedents. Studies of the martian atmosphere can be used to investigate the role of a primary atmosphere in its history. Atmosphere samples would also inform our understanding of the near-surface chemistry of the planet, and ultimately the prospects for life. High-precision isotopic analyses of constituent gases are needed to address these questions, requiring that the analyses are made on returned samples rather than in situ.
Introduction:
RNA interference (RNAi), or Post-Transcriptional Gene Silencing (PTGS), is an important biological process for modulating eukaryotic gene expression.
It is a highly conserved process of post-transcriptional gene silencing in which double-stranded RNA (dsRNA) causes sequence-specific degradation of mRNA sequences.
dsRNA-induced gene silencing (RNAi) has been reported in a wide range of eukaryotes, including worms, insects, mammals, and plants.
This process mediates resistance to both endogenous parasitic and exogenous pathogenic nucleic acids, and regulates the expression of protein-coding genes.
What are small ncRNAs?
micro RNA (miRNA)
short interfering RNA (siRNA)
Properties of small non-coding RNA:
Involved in silencing mRNA transcripts.
Called “small” because they are usually only about 21-24 nucleotides long.
Synthesized by first cutting up longer precursor sequences (like the 61nt one that Lee discovered).
Silence an mRNA by base pairing with some sequence on the mRNA.
Discovery of siRNA:
The first small RNA:
In 1993, Rosalind Lee (Victor Ambros lab) was studying a non-coding gene in C. elegans, lin-4, that was involved in silencing another gene, lin-14, at the appropriate time in the worm's development.
Two small transcripts of lin-4 (22 nt and 61 nt) were found to be complementary to a sequence in the 3' UTR of lin-14.
Because lin-4 encoded no protein, she deduced that it must be these transcripts that were causing the silencing through RNA-RNA interactions.
Types of RNAi (non-coding RNA):
miRNA: 23-25 nt long; trans-acting; binds the target mRNA with mismatches; inhibits translation.
siRNA: 21 nt long; cis-acting; binds the target mRNA at a perfectly complementary sequence.
piRNA (Piwi-interacting RNA): 25-36 nt long; expressed in germ cells; regulates transposon activity.
MECHANISM OF RNAI:
First the double-stranded RNA teams up with a protein complex named Dicer, which cuts the long RNA into short pieces.
Then another protein complex called RISC (RNA-induced silencing complex) discards one of the two RNA strands.
The RISC-docked, single-stranded RNA then pairs with the homologous mRNA and destroys it.
THE RISC COMPLEX:
RISC is a large (>500 kDa) multi-protein RNA-binding complex that triggers degradation of the target mRNA.
The double-stranded siRNA is unwound by an ATP-independent helicase.
The active component of RISC is the Argonaute (Ago) protein, an endonuclease that cleaves the target mRNA.
DICER: an endonuclease of the RNase III family.
Argonaute: the central component of the RNA-Induced Silencing Complex (RISC).
One strand of the dsRNA produced by Dicer is retained in the RISC complex in association with Argonaute.
ARGONAUTE PROTEIN domains:
1. PAZ (PIWI/Argonaute/Zwille): recognition of the target mRNA.
2. PIWI (P-element induced wimpy testis): breaks the phosphodiester bond of the mRNA (RNase H-like activity).
miRNA:
Double-stranded RNAs are naturally produced in eukaryotic cells during development, and they play a key role in regulating gene expression.
Nutraceutical market, scope and growth: Herbal drug technology (Lokesh Patil)
As consumer awareness of health and wellness rises, the nutraceutical market, which includes goods like functional foods, drinks, and dietary supplements that provide health advantages beyond basic nutrition, is growing significantly. As healthcare expenses rise, the population ages, and demand for natural and preventative health solutions increases, this industry is expanding quickly. Innovations in product formulation and the use of cutting-edge technology for customized nutrition further drive market expansion. With its worldwide reach, the nutraceutical industry is expected to keep growing and to provide significant opportunities for research and investment in a number of categories, including vitamins, minerals, probiotics, and herbal supplements.
This PDF is about schizophrenia.
For more details, see the SELF-EXPLANATORY channel on YouTube:
https://www.youtube.com/channel/UCAiarMZDNhe1A3Rnpr_WkzA/videos
(May 29th, 2024) Advancements in Intravital Microscopy- Insights for Preclini...Scintica Instrumentation
Intravital microscopy (IVM) is a powerful tool used to study cellular behavior over time and space in vivo. Much of our understanding of cell biology has been gained using various in vitro and ex vivo methods; however, these studies do not necessarily reflect the natural dynamics of biological processes. Unlike traditional cell culture or fixed-tissue imaging, IVM allows ultra-fast, high-resolution imaging of cellular processes over time and space in their natural environment. Real-time visualization of biological processes in the context of an intact organism helps maintain physiological relevance and provides insights into disease progression, response to treatments, and developmental processes.
In this webinar we give an overview of advanced applications of the IVM system in preclinical research. IVIM Technology is a provider of all-in-one intravital microscopy systems and solutions optimized for in vivo imaging of live animal models at sub-micron resolution. The system's unique features and user-friendly software enable researchers to probe fast, dynamic biological processes such as immune cell tracking, cell-cell interaction, vascularization, and tumor metastasis in exceptional detail. The webinar also gives an overview of IVM in drug development, offering a view into the intricate interactions between drugs or nanoparticles and tissues in vivo and allowing the evaluation of therapeutic interventions in a variety of tissues and organs. This interdisciplinary collaboration continues to drive the advancement of novel therapeutic strategies.
Seminar of U.V. Spectroscopy by SAMIR PANDASAMIR PANDA
Spectroscopy is a branch of science dealing with the study of the interaction of electromagnetic radiation with matter.
Ultraviolet-visible spectroscopy refers to absorption or reflectance spectroscopy in the UV-VIS spectral region.
Ultraviolet-visible spectroscopy is an analytical method that measures the amount of light absorbed by the analyte.
Richard's adventures in two entangled wonderlandsRichard Gill
Since the loophole-free Bell experiments of 2020 and the Nobel prizes in physics of 2022, critics of Bell's work have retreated to the fortress of super-determinism. Now, super-determinism is a derogatory word - it just means "determinism". Palmer, Hance and Hossenfelder argue that quantum mechanics and determinism are not incompatible, using a sophisticated mathematical construction based on a subtle thinning of allowed states and measurements in quantum mechanics, such that what is left appears to make Bell's argument fail, without altering the empirical predictions of quantum mechanics. I think however that it is a smoke screen, and the slogan "lost in math" comes to my mind. I will discuss some other recent disproofs of Bell's theorem using the language of causality based on causal graphs. Causal thinking is also central to law and justice. I will mention surprising connections to my work on serial killer nurse cases, in particular the Dutch case of Lucia de Berk and the current UK case of Lucy Letby.
A brief information about the SCOP protein database used in bioinformatics.
The Structural Classification of Proteins (SCOP) database is a comprehensive and authoritative resource for the structural and evolutionary relationships of proteins. It provides a detailed and curated classification of protein structures, grouping them into families, superfamilies, and folds based on their structural and sequence similarities.
Earliest Galaxies in the JADES Origins Field: Luminosity Function and Cosmic ...Sérgio Sacani
We characterize the earliest galaxy population in the JADES Origins Field (JOF), the deepest imaging field observed with JWST. We make use of the ancillary Hubble optical images (5 filters spanning 0.4-0.9 µm) and novel JWST images with 14 filters spanning 0.8-5 µm, including 7 medium-band filters, and reaching total exposure times of up to 46 hours per filter. We combine all our data at >2.3 µm to construct an ultradeep image, reaching as deep as ≈31.4 AB mag in the stack and 30.3-31.0 AB mag (5σ, r = 0.1" circular aperture) in individual filters. We measure photometric redshifts and use robust selection criteria to identify a sample of eight galaxy candidates at redshifts z = 11.5-15. These objects show compact half-light radii of R_1/2 ∼ 50-200 pc, stellar masses of M⋆ ∼ 10^7-10^8 M⊙, and star-formation rates of SFR ∼ 0.1-1 M⊙ yr^-1. Our search finds no candidates at 15 < z < 20, placing upper limits at these redshifts. We develop a forward-modeling approach to infer the properties of the evolving luminosity function, without binning in redshift or luminosity, that marginalizes over the photometric redshift uncertainty of our candidate galaxies and incorporates the impact of non-detections. We find a z = 12 luminosity function in good agreement with prior results, and that the luminosity function normalization and UV luminosity density decline by a factor of ∼2.5 from z = 12 to z = 14. We discuss the possible implications of our results in the context of theoretical models for the evolution of the dark matter halo mass function.
Measuring Community Resilience: a Bayesian Approach CESUN2018
1. Alexander Gilgur, Jose Emmanuel Ramirez-Marquez
School of Systems and Enterprises, SIT
CESUN2018 Global Conference
Tokyo, Japan. June, 2018
A Bayesian Approach
Measuring Community Resilience
The research performed by Jose E. Ramirez Marquez leading to these results has
received funding from the National Science Foundation, CRISP Type 2/Collaborative
Research: Resilience Analytics: A Data-Driven Approach for Enhanced
Interdependent Network Resilience, Award number 1541165.
3. 3
Introduction
After an adverse event,
Will community recover?
Will it recover to a new level?
Will it evolve into something new?
4. 4
Introduction
After an adverse event,
Will community recover?
Will it recover to a new level?
Will it evolve into something new?
How can we measure community resilience?
5. 5
Exploring Community Response Drivers
Example: my communities on LinkedIn (http://socilab.com/#home)
Community is a complex social network with multiple layers of
interactions among the nodes.
Nodes and Edges have characteristics (connectedness,
centrality; distance, strength) that impact community response
to adverse events.
Aggregate measures (uniformity, stationarity, distribution,
resilience) better describe the community as a whole.
State Variable depends on what is important.
6. 6
Exploring Community Response Drivers
Example: my communities on LinkedIn (http://socilab.com/#home)
Community is a complex social network with multiple layers
of interactions among the nodes.
Nodes and Edges have characteristics (connectedness,
centrality; distance, strength) that impact community
response to adverse events.
Aggregate measures (uniformity, stationarity, distribution,
resilience) better describe the community as a whole.
State Variable depends on what is important.
Traditional approach - Deductive: start with aggregate measures and dive into the root causes to explain behavior.
7. 7
Disaster Recovery Timeline
Regardless of how we measure the state of a community, adverse events cause disturbances in the state variable.
[Figure: recovery trajectories (Oscillatory @ Old Level, Oscillatory @ New Level, Monotonic @ New Level)]
8. 8
Disaster Recovery Timeline
After the change, the state variable stabilizes at a new level or at the old one.
Stabilization can take from a few months to a few years.
Regardless of how we measure the state of a community, adverse events cause disturbances in the state variable.
[Figure: recovery trajectories (Oscillatory @ Old Level, Oscillatory @ New Level, Monotonic @ New Level)]
11. 11
Proposal
The linear regression coefficient is the sensitivity of recovery time to the severity of disturbing events.
The community resilience metric is then the inverse of this regression coefficient.
Measure community resilience as the inverse of the sensitivity of recovery time to the severity of disturbing events.
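The proposal can be made concrete with a minimal sketch (not the authors' code): once pairs of disturbance severity and stabilization time have been collected, fit a line and invert the slope. The data values and variable names below are purely illustrative.

```python
import numpy as np
from scipy import stats

# Hypothetical observations: severity of each disturbance (change in the
# state variable) and the time the community took to restabilize (days).
severity = np.array([0.5, 1.2, 2.0, 3.1, 4.4])
stabilization_time = np.array([40.0, 95.0, 160.0, 250.0, 345.0])

# Fit stabilization_time = slope * severity + intercept.
fit = stats.linregress(severity, stabilization_time)

# The slope is the sensitivity of recovery time to severity;
# the proposed community resilience metric MR is its inverse.
mr = 1.0 / fit.slope

print(f"slope = {fit.slope:.1f} days per unit severity")
print(f"MR    = {mr:.4f}")
print(f"R^2   = {fit.rvalue**2:.3f}  (specificity of the metric)")
```

A steep slope (long recovery even after small hits) yields a small MR; a shallow slope yields a large MR.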
12. 12
Methodology
1. Develop a mathematical model of community population growth as a function of Media Positivity and Housing Prices.
2. Simulate disturbances in Housing Prices. These will lead to changes in Population Egress rate.
3. Shared Sentiment will change, leading to changes in Media Positivity.
4. Population Ingress will change accordingly, and Population will be disturbed.
5. Measure, for each disturbance:
○ New stable level of Population
○ Time to stabilization.
6. Fit a regression line.
7. Calculate MR
8. Record R²: it quantifies the specificity of the metric.
18. 18
Experiment
AnyLogic Simulation Dashboard
Population Growth Rate:
Equilibrium:
Population Ingress Rate: I_P(t) = A(t) * S_H^M * M_+(t)
Population Egress Rate: E_P(t) = B(t) * S_H^P * H_E(t)
where
S_H^M = household sensitivity to media
M_+(t) = media positivity index
A(t), B(t) = time-dependent coefficients
H_E(t) = expensive housing index
S_H^P = household sensitivity to prices
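A minimal sketch of how these rate equations could drive a discrete-time population simulation. The assumption that population growth is proportional to the difference between ingress and egress rates, the coefficient values, and the shapes of the housing and media signals are all illustrative and not taken from the paper.

```python
import numpy as np

def ingress_rate(t, A, S_HM, M_pos):
    """I_P(t) = A(t) * S_H^M * M_+(t): ingress driven by media positivity."""
    return A(t) * S_HM * M_pos(t)

def egress_rate(t, B, S_HP, H_exp):
    """E_P(t) = B(t) * S_H^P * H_E(t): egress driven by expensive housing."""
    return B(t) * S_HP * H_exp(t)

# Illustrative inputs: constant coefficients, a housing-price shock at day 100,
# and media positivity that dips with shared sentiment and slowly recovers.
A = lambda t: 0.02
B = lambda t: 0.02
S_HM, S_HP = 1.0, 1.0
H_exp = lambda t: 1.0 + (0.2 if t >= 100 else 0.0) * np.exp(-(t - 100) / 20.0)
M_pos = lambda t: 1.0 - (0.1 if t >= 100 else 0.0) * np.exp(-(t - 100) / 30.0)

P = 100_000.0                                   # initial population
history = []                                    # (t, population, growth)
for t in range(365):
    # Assumed growth law: dP/dt = (I_P(t) - E_P(t)) * P
    growth = (ingress_rate(t, A, S_HM, M_pos) - egress_rate(t, B, S_HP, H_exp)) * P
    P += growth
    history.append((t, P, growth))
```

Under these assumed inputs the population sits at equilibrium (ingress equals egress) until the day-100 shock, then drifts to a new, lower level; the size of that drop and the time to reach it are exactly the quantities the methodology feeds into the regression.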
19. 19
Experimental Data Analysis
[Figures: time-series panels of Population, Positive Media, and Population Growth; scatter of Stabilization Time [days] vs. Severity ("Population Stabilization Time")]
Disturbances in Population can be detected as outliers in Media Positivity or in Population Growth.
Stabilization is achieved when Population Growth becomes zero.
Regression Analysis
Data From Simulation
The high R² points to high specificity of the metric: 93.1% of the variance in Stabilization Time is explained linearly by the Severity of Disturbance.
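As a concrete illustration of these two detection rules (an outlier marks the disturbance, zero growth marks stabilization), here is a sketch, not the authors' code, of extracting severity and stabilization time from a simulated series such as the `history` list in the earlier sketch; the tolerance threshold is an illustrative stand-in for "growth becomes zero".

```python
import numpy as np

def measure_disturbance(history, t_shock, growth_tol=1.0):
    """Return (severity, stabilization_time_days) for one disturbance.

    history : list of (t, population, growth) tuples from the simulation
    t_shock : time at which the disturbance was injected
    Severity is measured as the change in the state variable (population)
    between its pre-shock level and its new stable level."""
    ts = np.array([h[0] for h in history])
    pop = np.array([h[1] for h in history])
    growth = np.array([h[2] for h in history])

    p_before = pop[ts < t_shock][-1]              # level just before the hit

    # First time after the shock at which growth has effectively returned to zero.
    settled = (ts > t_shock) & (np.abs(growth) < growth_tol)
    t_stable = ts[settled][0]

    severity = abs(pop[ts == t_stable][0] - p_before)
    return severity, t_stable - t_shock

# Usage with the simulated history from the previous sketch:
# severity, t_rec = measure_disturbance(history, t_shock=100)
```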
20. 20
Conclusion and Further Work
Proposed a new metric of community resilience
Developed a Bayesian method for calculating it
Developed a model where the metric can be measured
Demonstrated its usability via a simulation
Further work:
Identify real communities where this metric can be applied
Quantify the impact of media positivity on such communities
Quantify the impact of cost of living on such communities
Apply this methodology to study their resilience
22. Alexander Gilgur, Jose E. Ramirez-Marquez
agilgur@stevens.edu, jmarquez@stevens.edu
+1(408)828-2115
Editor's Notes
Hello. My name is Alex Gilgur.
I am a 3rd-year PhD student at Stevens, working under the supervision of Professor Ramirez-Marquez. I am honored to be here presenting this paper. If you find it useful, that will be the highest reward for me.
When I am not taking classes and not working on my research, I help keep 2 (now 2.2) billion people connected: I forecast FB network demand. We have many data centers communicating on behalf of thousands of services, and we need to know in advance how many terabits per second of demand we will have 2, 3, 4, or 5 years from now.
Forecasted demand goes to a set of simulation+optimization tools, which tell us how vulnerable our network is under the forecasted load, and where we need to augment capacity and boost its resilience.
Multiple datacenters
Thousands of services;
Millions of servers, sometimes in different parts of the world.
All these services and products need to talk to each other in a variety of patterns. Things get complicated really fast.
What does this have to do with community resilience?
(From previous slide) –
-----------------------------------
Everything. There are similarities in structure and in questions that we are answering:
how big will the network be? (Tbps and # of nodes and links)
how vulnerable will it be?
how do we boost network resilience without affecting the users?
(From previous slide) –
-----------------------------------
And the first question we need to answer when it comes to boosting resilience is:
how do we benchmark network resilience?
How do we measure it?
Same thing for community.
This is an ego-centric view of my community on LinkedIn. I am sure you have similarly complex and complicated communities around you. I encourage you to give it a shot; just follow this link.
------------------------------------------------
Community is a network, and same questions apply to it.
Community can be characterized by a state variable (or a set of state variables),
which depends on what is important for the work we are doing.
It may be language distribution; education level; availability of jobs; population.
(From previous slide) –
------------------------------
I am taking the traditional top-down (holistic, deductive) approach: observe aggregate measures and then dive into the community response drivers. How do we preserve the community, or, if it is a gang, how do we destroy it?
After a disturbance, the response dynamics varies from community to community.
Communities have something in common, at least communities with a measured state variable: they may take different trajectories, but unless they fall apart completely and stop being a community, they do stabilize after a disturbance, and we can use this property to quantify community resilience.
Can we quantify community resilience if a community only got hit once?
After the San Francisco earthquake of 1906, the city was rebuilt in two years. That is very good resilience: the city was destroyed and burned down, and yet in 1908 it was full of life and business activity (primarily banking and entertainment) at a level not seen since the Gold Rush.
The 1989 earthquake was far less devastating, yet the city took a good 10 years to recover, and there still are neighborhoods in SF and Oakland that blame their misery on the earthquake. But the communities stabilized in about the same time, close to 2-3 years.
These are only two events, 83 years apart, in the same geographical location. Are these different communities?
Or is it same community whose resilience changed? Did it change? If so, can we come up with a stable metric of community resilience?
After a disturbance, the response dynamics varies from community to community.
Note that in the case of an earthquake, or a hurricane, or another natural disaster, we can measure its power (e.g., Richter scale; hurricane category; etc.).
But other disturbances that do not have an objective measurement defined can affect the community.
We measure the strength of a disturbance by its effect on the state variable.
Details are in the paper. We will talk about it a few slides down the road.
What if we plot the strength of the disturbance on the horizontal axis, and the time to stabilization on the vertical axis?
We can fit a line through these points, and the slope of this line, measured by its tangent, or the regression parameter, will tell us how sensitive the stabilization time is to the severity of the disturbance.
The steeper the slope, the longer it takes to recover from even a small disturbance; so if we invert the computed regression parameter, we will have a metric of community resilience.
That was the preamble.
Now consider this very simple community model. We assume that we can quantify sensitivity to cost of living, media sensitivity, shared sentiment, etc.
Further, we assume that the media will report positively or negatively about this community. This is quantified as Media Positivity [0…1]. Positive reporting will lead to more people coming in. Negativity of shared sentiment in the community (e.g., about housing prices) will lead to people leaving the area.
A change in media positivity will lead to a change in ingress rate, shifting the balance, and the community will grow or become smaller
Then, by repeatedly turning any of these knobs, we introduce disturbances, fit a regression line to the data, and measure the sensitivity of response time to disturbance severity.
We are measuring disturbance severity by the size of the change in the state variable (again, 1906 earthquake was devastating to the city infrastructure, but had relatively little effect on the community sentiment).
Response time is measured as the total time it takes to restabilize the community.
------------------------
With monotonic restabilization, we use Eq. (6) to measure the size of the disturbance.
Eq. (9) is the form of the model we are fitting, and (10) is the community resilience metric.
------------------------
With oscillatory restabilization, we propose using Eq. (12).
Eq. (9) is the form of the model we are fitting, and (10) is the community resilience metric.
We are measuring disturbance severity by the size of the change in the state variable (again, 1906 earthquake was devastating to the city infrastructure, but had relatively little effect on the community sentiment).
Response time is measured as the total time it takes to restabilize the community.
------------------------
This algorithm describes how it is done and how the model is adjusted as we collect more data.
And here it is all put together with the formulae.
We follow the Bayesian view of the regression process: adjust our prior assumptions based on new evidence. When there has been only one hit, we can simply divide the stabilization time by the size of the hit, and we are done; every new hit produces an adjustment to the regression line.
---------------------------
It will converge, for a number of reasons, not least of which is the Central Limit Theorem: from the community's perspective, the disturbances happen randomly, and the community's response to them can be treated as random sampling. The linear regression process is unbiased, meaning it draws the best-fit line through the means of the distributions of Y for each X, and the CLT states that sample means converge to the population mean.
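To illustrate the iterative adjustment described in these notes, here is a minimal sketch of a sequential Bayesian update of the regression slope under a conjugate Normal prior with known observation noise. It uses a no-intercept simplification (stabilization time proportional to severity, which also covers the single-hit case of dividing time by severity); it is not a reproduction of the paper's Eqs. (9)-(10), and all prior and data values are illustrative.

```python
class BayesianSlope:
    """Sequential Bayesian estimate of k in T = k * S + noise (no intercept),
    with a Normal prior on k and known noise standard deviation.
    The community resilience metric is then MR = 1 / E[k]."""

    def __init__(self, prior_mean=50.0, prior_var=100.0 ** 2, noise_sigma=20.0):
        self.mean = prior_mean            # prior: ~50 days of recovery per unit severity
        self.precision = 1.0 / prior_var  # weak prior
        self.noise_prec = 1.0 / noise_sigma ** 2

    def update(self, severity, stab_time):
        """Conjugate Normal update after observing one disturbance."""
        new_precision = self.precision + self.noise_prec * severity ** 2
        new_mean = (self.precision * self.mean
                    + self.noise_prec * severity * stab_time) / new_precision
        self.mean, self.precision = new_mean, new_precision

    @property
    def resilience(self):
        return 1.0 / self.mean

# Each new hit adjusts the estimate, as in the algorithm on the slide:
est = BayesianSlope()
for s, t in [(0.5, 40.0), (1.2, 95.0), (2.0, 160.0)]:   # illustrative (severity, days)
    est.update(s, t)
print(f"posterior slope = {est.mean:.1f} days/unit severity, MR = {est.resilience:.4f}")
```

As disturbances accumulate, the posterior mean approaches the least-squares slope, consistent with the convergence argument above.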
We did an experimental analysis on a simulated community, collecting ingress, egress, media positivity while hitting the simulated community with different disturbances. Results are on the next slide.
Here the validity of the model is irrelevant: the goal was to build an engine that would allow us to produce disturbances while measuring a state variable and its stabilization time.
But as part of my dissertation research, I am now conducting an analysis of the validity of this model: investigating the impact of news-media positivity on population growth.
So in the simulated experiment, we measured population (as the state variable) and computed the population growth rate. Here it is very easy to see when the state variable (population) stabilizes after the disturbance.
These values and times were put into a regression model, and the value of the resilience metric for this community was obtained.
Note the high value of R^2 - it means the metric is very specific: disturbance severity, measured by the state variable, explains 93% of the variance in the stabilization time.
And that concludes my presentation. Next I am planning to study real communities, focusing on media sentiment and population growth during and after economic recessions in the Silicon Valley and San Francisco.