This document discusses sources of variability in response time sequences. It summarizes research showing increased intra-individual variability in response times among those with ADHD or autism compared to controls. It then describes several methods used to analyze response time variability, including variance analysis of response time distributions, fitting response time histograms to models like the ex-Gaussian, and time series analysis of response time sequences to examine trends, autocorrelation, and stationarity. Examples applying these methods to response time data from two experimental tasks are provided.
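As a sketch of the ex-Gaussian fitting and time-series checks mentioned above, the snippet below recovers ex-Gaussian parameters by the method of moments and computes lag-1 autocorrelation on a simulated response time sequence; the parameter values and sequence length are illustrative assumptions, not taken from the original tasks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an RT sequence as ex-Gaussian: a Gaussian stage plus an
# exponential stage (parameter values are illustrative, in seconds).
mu, sigma, tau = 0.40, 0.05, 0.15
rts = rng.normal(mu, sigma, 5000) + rng.exponential(tau, 5000)

def exgauss_moments(x):
    """Method-of-moments fit of the ex-Gaussian, using the identities
    mean = mu + tau, var = sigma^2 + tau^2, skew = 2 tau^3 / var^(3/2)."""
    m, s = x.mean(), x.std(ddof=1)
    skew = ((x - m) ** 3).mean() / x.std() ** 3
    tau_hat = s * (skew / 2.0) ** (1.0 / 3.0)
    sigma_hat = np.sqrt(max(s ** 2 - tau_hat ** 2, 0.0))
    return m - tau_hat, sigma_hat, tau_hat

def lag1_autocorr(x):
    """Lag-1 autocorrelation: a first check for sequential structure."""
    d = x - x.mean()
    return float((d[:-1] * d[1:]).sum() / (d * d).sum())

mu_hat, sigma_hat, tau_hat = exgauss_moments(rts)
```

Because the simulated trials are independent, the lag-1 autocorrelation should be near zero; a real RT series with trends or drifts would show clear sequential dependence.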
Enzymes are biological catalysts that speed up chemical reactions without being used up. They have an active site that substrates fit into perfectly through temporary bonds. Enzymes increase the rate of reactions by lowering their activation energy. The rate of enzyme-catalyzed reactions depends on factors like enzyme and substrate concentration, temperature, and pH. Enzyme activity can be inhibited through competitive or non-competitive inhibition.
Hormones act as biological regulators through three main levels - the nervous system, hormonal regulation, and intracellular enzymes. There are two main types of hormones - those produced by endocrine glands which enter the bloodstream, and local hormones which regulate tissues locally. Hormones regulate key processes like metabolism, digestion, and ion concentration in the body. They act through receptors on the surface of cells or inside cells, and trigger second messengers that lead to biological responses like protein synthesis. The hypothalamus and pituitary gland work together to regulate other endocrine glands and control numerous bodily functions through hormone release and feedback loops.
Pharmacokinetic variability occurs due to differences in drug absorption, distribution, metabolism, and excretion between individuals. Several factors influence this variability including age, body weight, pregnancy, and disease states. Drug metabolism and excretion are often lower in newborns compared to adults due to immature organ systems and lower enzyme levels. Plasma protein binding is also lower in newborns which increases drug distribution. Renal function is reduced in newborns leading to slower drug clearance. These developmental differences can increase the risk of adverse drug reactions in newborns if dosages are not appropriately adjusted.
The document discusses different types of transport mechanisms across cell membranes including:
1) Active transport which involves transport against a concentration gradient using cellular energy.
2) Passive transport mechanisms like diffusion, facilitated diffusion, osmosis, and dialysis/filtration which do not require energy.
3) Vesicular transport which uses vesicles to transport materials in or out of cells. Vesicular transport includes endocytosis and exocytosis.
Enzymes are proteins that act as biological catalysts, speeding up chemical reactions without being used up. They work via a "lock and key" mechanism, though the induced fit model suggests enzymes are flexible. Enzyme activity can be measured by looking at reaction rates under different temperatures, pH levels, enzyme and substrate concentrations. Heating an enzyme above its optimal temperature can cause it to denature, changing its shape so the active site no longer fits the substrate. Many enzymes are used industrially in washing powders, food processing, and antibiotic production.
Ch02 Drug Receptor Interactions and Pharmacodynamics
This document summarizes key concepts about drug-receptor interactions and pharmacodynamics from Chapter 2. It discusses how drugs produce effects by binding to receptors on cells and tissues. It defines terms like agonists, antagonists, affinity, efficacy and potency. It also describes different types of receptors that drugs can bind to, including G-protein coupled receptors and intracellular receptors. Finally, it explains the concepts of spare receptors, desensitization, and the different types of antagonism - competitive and noncompetitive.
Drug Receptor Interactions and Drug Antagonism: Dr Rahul Kunkulol's Power p...
1. The document discusses various types of drug receptors and receptor superfamilies including ligand-gated ion channels, G-protein coupled receptors, kinase-linked receptors, and nuclear receptors.
2. It describes the mechanisms of drug-receptor interactions and signal transduction pathways involving second messengers like cAMP, IP3, DAG, Ca2+, and nitric oxide.
3. The concepts of agonism, antagonism, partial agonism, and theories of drug-receptor binding like the two-state model are explained. Different types of antagonism like competitive and non-competitive are also summarized.
This document outlines core concepts and learning objectives related to pharmacology of the gastrointestinal system. It includes 10 objectives about drugs used to treat peptic ulcer disease, chemotherapy-induced nausea and vomiting, constipation, diarrhea, and inflammatory bowel disease. It also provides a topical outline that will be covered, such as gastrointestinal disorders, liver diseases, nausea and vomiting, and pharmacology of the gastrointestinal tract.
1. Drugs can interact with receptors, ion channels, enzymes, and carrier molecules in cells.
2. Receptor-mediated mechanisms involve drugs binding to receptors, forming drug-receptor complexes that trigger biological responses. Non-receptor mechanisms do not involve receptors.
3. There are different types of receptors and signal transduction pathways, including ionotropic receptors, G-protein coupled receptors, enzyme-linked receptors, and receptors regulating gene expression.
These formulations are used to treat dry eyes and contain polymers that adhere to the eye's mucous membrane. Bioadhesive polymers can prolong the release of drugs by binding to mucosal surfaces. The eye presents challenges for drug delivery due to its protective mechanisms, so bioadhesive formulations are designed to be comfortable and to maintain drug concentrations in the eye. Examples given are Hypotears and Sno Tears eye drops, which contain polyvinyl alcohol to lubricate the eyes and increase tear production.
1. Dose response relationships can be represented by either graded or quantal curves, with graded curves showing a continuous response to varying doses and quantal curves showing the proportion of subjects responding at different doses.
2. Key features of dose response curves include the median effective dose (ED50) which produces a 50% response, potency which is measured by the dose required for 50% effect, and the therapeutic index which is the ratio of toxic to effective doses.
3. Both curve types provide information about a drug's potency but graded curves also indicate maximum efficacy while quantal curves show variability in individual responses.
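The quantities in points 1–3 can be made concrete with a minimal Hill-equation sketch; the ED50, TD50, and Hill coefficient below are hypothetical values chosen for illustration.

```python
def graded_response(dose, ed50, emax=100.0, hill=1.0):
    """Hill equation for a graded dose-response curve (% of maximal effect)."""
    return emax * dose ** hill / (ed50 ** hill + dose ** hill)

# Hypothetical median effective and median toxic doses (mg).
ed50, td50 = 10.0, 200.0
therapeutic_index = td50 / ed50              # ratio of toxic to effective dose

half_maximal = graded_response(ed50, ed50)   # by definition, 50% of Emax
```

By construction, dosing at the ED50 yields exactly half the maximal effect, and a larger therapeutic index indicates a wider safety margin between effective and toxic doses.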
1. Receptors are cellular macromolecules that mediate chemical signaling between and within cells. They have affinity for ligands and intrinsic activity that triggers a pharmacological response upon ligand binding.
2. Agonists have high affinity and intrinsic activity, forming active receptor complexes. Antagonists have affinity but no intrinsic activity. Partial agonists and inverse agonists have intermediate effects.
3. There are several types of receptors, including ion channels, G protein-coupled receptors, kinase-linked receptors, intracellular receptors, and enzymes. Long-term receptor exposure can lead to down-regulation or up-regulation, depending on whether the ligand is an agonist or an antagonist.
The document summarizes different types of receptors and their classification. It discusses four main types: ligand-gated ion channel receptors (ionotropic), G-protein coupled receptors (metabotropic), kinase-linked receptors, and nuclear receptors, with details about their molecular structure, signaling mechanisms, examples, and comparisons between receptor types. Overall, it provides an overview of receptor pharmacology, receptor classification, and the role of receptors in drug action and signaling pathways.
Deterministic sampling methods can be used to generate ensembles that represent modeling uncertainty in a more efficient and reproducible way than traditional Monte Carlo sampling. The document discusses applications of deterministic sampling in fields like dynamic metrology, medicine, meteorology and more. It also presents some specific deterministic sampling techniques like matched moments, sigma points, and sample annealing and discusses how these can be used for both direct uncertainty quantification and inverse problems like model identification.
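A minimal sketch of the "matched moments" idea described above, assuming a one-dimensional Gaussian input: two symmetric points reproduce the first two moments exactly, so a tiny deterministic ensemble can stand in for many Monte Carlo draws when propagated through a nonlinear model.

```python
import numpy as np

# Matched-moments ensemble for a 1-D Gaussian input: the two points
# mu ± sigma with equal weights reproduce the mean and variance exactly.
mu, sigma = 1.0, 0.5
points = np.array([mu - sigma, mu + sigma])
weights = np.array([0.5, 0.5])

# Propagate through a nonlinear model f(x) = x^2 and estimate E[f(x)].
estimate = weights @ points ** 2
exact = mu ** 2 + sigma ** 2      # E[x^2] for a Gaussian, in closed form
```

For this quadratic model the two-point ensemble is exact; in higher dimensions the same idea generalizes to sigma-point sets of size 2n+1.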
Variability (noise) caused by random variation rather than true differences among individuals is an intrinsic feature of the biomedical world. Time series data from patients (in clinical science) or infection counts (in epidemics) can vary due to both intrinsic differences and incidental fluctuations. The use of traditional fitting methods for ODEs applied to real data sets implies that deviation from some trend is ascribed to error or parametric heterogeneity. Thus, noise can be wrongly classified as differences among individuals, leading to potentially erroneous predictions and misguided policies or research programs. We studied the ability of model fitting, under different hypotheses (fixed or random effects), to capture individual differences in the underlying data. We explore a simple (exactly solvable) example displaying initial exponential growth, comparing state-of-the-art stochastic fitting with traditional least squares approximations. We discuss the implications of these results for the interpretation of biological data, using the 2014-2015 Ebola epidemic in Africa as an example.
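The traditional least-squares side of the comparison above can be sketched with a log-scale fit to simulated exponential growth; all parameter values and the noise level are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Exactly solvable toy model: y(t) = y0 * exp(r * t), observed with
# multiplicative noise (all values here are illustrative).
t = np.linspace(0.0, 5.0, 40)
y0, r = 2.0, 0.6
y = y0 * np.exp(r * t) * rng.lognormal(0.0, 0.05, t.size)

# Traditional least squares on the log scale: every deviation from the
# fitted trend is ascribed to error, with no room for individual effects.
slope, intercept = np.polyfit(t, np.log(y), 1)
```

The fit recovers the growth rate well here precisely because the noise really is pure measurement error; when deviations instead reflect heterogeneity among individuals, this single-trend fit silently misattributes them.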
1. The document discusses implicit shape representations for liver segmentation from CT scans, comparing heat, signed distance, and Poisson transforms.
2. It evaluates these representations using principal component analysis to build a linear shape space model from training data.
3. Results show the Poisson transform provides the most stable and effective implicit representation for segmentation, outperforming other methods in experiments projecting new shapes into the learned shape space.
Although we are often told not to do it, statistical scientists frequently predict the value of outcome measures of physical systems at input points far from the observed data. Since predictions are made in new regions of the input space, statistical theory cannot dictate optimal rules for measures of uncertainty associated with extrapolation. This talk presents several solutions based on simple principles. The solutions are illustrated via the analysis of data generated by dropping spheres of varying radii and masses from different heights. Some of the techniques apply to more complex physical systems. The efficacy of these techniques is demonstrated using data (experimental and simulated) of the level of complexity physical scientists frequently face. Scientists should tailor these techniques to fit the needs of a particular application.
Statistics is the science of dealing with numbers and data. It involves collecting, summarizing, presenting, and analyzing data. There are four main steps: data collection, summarization by removing unwanted data and classifying/tabulating, presentation with diagrams/graphs/tables, and analysis using measures like average, dispersion, and correlation. Descriptive statistics summarize and describe data, while inferential statistics allow generalizing from samples to populations. Common descriptive statistics include measures of central tendency (mean, median, mode), variability (range, variance, standard deviation), and distribution properties. Inferential statistics techniques like hypothesis testing and ANOVA are used to make inferences about populations based on samples.
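The descriptive measures listed above can be computed directly with Python's standard `statistics` module; the sample below is invented for illustration.

```python
import statistics as st

# A small illustrative sample.
data = [2, 4, 4, 4, 5, 5, 7, 9]

mean = st.mean(data)                 # central tendency
median = st.median(data)
mode = st.mode(data)
value_range = max(data) - min(data)  # variability
variance = st.pvariance(data)        # population variance
stdev = st.pstdev(data)              # population standard deviation
```

For inference from a sample to a population, the sample versions `st.variance` and `st.stdev` (with the n-1 denominator) would be used instead.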
In the presence of relevant physical observations, one can usually calibrate a computer model, and even estimate systematic discrepancies of the model from reality. Estimating and quantifying the uncertainty in this model discrepancy can lead to reliable predictions - so long as the prediction "is similar to" the available physical observations. Exactly how to define "similar" has proven difficult in many applications. Clearly it depends on how well the computational model captures the relevant physics in the system, as well as how portable the model discrepancy is in going from the available physical data to the prediction. This talk will discuss these concepts using computational models ranging from simple to very complex.
Vertical wind speed measurements from Doppler LIDAR were analyzed to characterize wind speed extremes. Gaussian process models were fit to capture the nonseparability of the spatial and temporal covariance structure. A spectral-in-time covariance function was developed that includes a frequency-dependent spatial coherence function. Fast fitting methods were used to approximate the likelihood for large datasets. Zero crossing statistics were also examined to analyze wind speed thresholds over time. Future work will include incorporating cyclostationary models and additional climate variables as covariates.
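As a sketch of the zero/level-crossing statistics mentioned above, the snippet counts threshold upcrossings in a synthetic series; real LIDAR records are temporally correlated, unlike these independent draws, and all values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for a vertical wind speed record (m/s).
x = rng.normal(8.0, 2.0, 10_000)
threshold = 8.0

# Upcrossings: samples where the series passes from below to above the
# threshold -- the basic quantity behind zero/level-crossing statistics.
above = x > threshold
upcrossings = int(np.sum(~above[:-1] & above[1:]))
```

For independent draws with the threshold at the mean, roughly a quarter of consecutive pairs are upcrossings; temporal correlation in real wind data would lower this count, which is exactly what crossing statistics are used to quantify.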
An Application of Uncertainty Quantification to MPM
The document discusses uncertainty quantification (UQ) methods like Latin hypercube sampling that can be applied to material point method (MPM) simulations to characterize outputs given uncertain inputs. It provides examples of using UQ on an MPM cantilever beam model, finding new correlations between inputs like beam thickness and outputs like vibration frequency. Code mistakes were discovered and correcting them led to additional insights from re-running the UQ analysis.
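A minimal Latin hypercube sampler, to make the stratification idea concrete; the two dimensions (e.g. beam thickness and load) and the sample count are hypothetical, not taken from the MPM study.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """Basic Latin hypercube sample on [0, 1): in every dimension the
    points occupy n_samples distinct, equally wide strata."""
    u = rng.random((n_samples, n_dims))
    strata = np.column_stack([rng.permutation(n_samples) for _ in range(n_dims)])
    return (strata + u) / n_samples

rng = np.random.default_rng(42)
X = latin_hypercube(10, 2, rng)   # e.g. thickness and load, scaled to [0, 1)
```

Each column hits every decile exactly once, so far fewer runs are needed than with plain random sampling to cover the input ranges of an expensive simulation.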
This document discusses dynamics of structures with uncertainties. It begins with an introduction to stochastic single degree of freedom systems and how natural frequency variability can be modeled using probability distributions. It then discusses how to extend this approach to stochastic multi degree of freedom systems using stochastic finite element formulations and modal projections. Key challenges with statistical overlap of eigenvalues are noted. The document provides mathematical models of equivalent damping in stochastic systems and examples of stochastic frequency response functions.
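The stochastic single-degree-of-freedom idea can be sketched by sampling an uncertain stiffness and propagating it to the natural frequency; the mass, stiffness distribution, and scatter below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stochastic SDOF system: the stiffness is uncertain, so the natural
# frequency omega_n = sqrt(k / m) inherits a probability distribution.
# Illustrative values: m = 2 kg, k ~ Normal(800, 40) N/m (5% scatter).
m = 2.0
k = rng.normal(800.0, 40.0, 100_000)
omega_n = np.sqrt(k / m)

mean_omega = float(omega_n.mean())   # close to the nominal 20 rad/s
std_omega = float(omega_n.std())
```

The 5% scatter in stiffness maps to roughly 2.5% scatter in frequency (the square root halves relative variation), which is the kind of propagated variability a stochastic finite element formulation tracks mode by mode.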
These days, much of the data being generated takes the form of time series, from climate records and users' posts on social media to stock prices and neurological recordings. Discovering the temporal dependence between different time series is an important task in time series analysis, with applications in fields ranging from social media advertising, influencer identification, and marketing to share markets, psychology, and climate science. This report studies the identification of such networks of dependencies.
In this report we examine how this problem has been studied in the field of econometrics. We also study three different approaches for building causal networks between time series, and then see how this knowledge has been used in three quite different fields. Finally, some important open issues are presented, along with directions in which this work can be extended in further research.
Spatio-Temporal Characterization with Wavelet Coherence: A Nexus between Envir...
Identifying spatio-temporal synchrony in a complex, interacting, oscillatory coupled system is a challenge. In particular, characterizing statistical relationships between environmental or biophysical variables and multivariate pandemic data is difficult because of the intrinsic variability and non-stationary nature of the time series in space and time. This paper presents a methodology to address these issues by examining the bivariate relationship between Covid-19 and temperature time series in the time-localized frequency domain, using Singular Value Decomposition (SVD) and continuous cross-wavelet analysis. First, the dominant spatio-temporal trends are derived via the eigendecomposition of the SVD. The Covid-19 incidence data and the temperature data of the corresponding period are transformed into significant eigen-state vectors for each spatial unit. The Morlet wavelet transformation is performed to analyse and compare the frequency structure of the dominant trends derived by the SVD. The result provides cross-wavelet transform and wavelet coherence measures over ranges of time periods for the corresponding spatial units. Additionally, the wavelet power spectrum, paired wavelet coherence statistics, and phase differences are estimated. The results suggest statistically significant coherency at various frequencies, providing insight into the spatio-temporal dynamics. Moreover, they provide information about the complex conjugate dynamic relationships in terms of phases and phase differences.
The document provides information about performing chi-square tests and choosing appropriate statistical tests. It discusses key concepts like the null hypothesis, degrees of freedom, and expected versus observed values. Examples are provided to illustrate chi-square tests for goodness of fit and comparison of proportions. The document also compares parametric and non-parametric tests, providing examples of when each would be used.
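A worked chi-square goodness-of-fit example in the spirit of the document, with invented counts for a fair-die null hypothesis.

```python
# Chi-square goodness of fit for a die rolled 60 times under the null
# hypothesis of a fair die (expected count 10 per face; observed counts
# invented for illustration).
observed = [8, 9, 12, 11, 6, 14]
expected = [10] * 6

chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
df = len(observed) - 1        # degrees of freedom = categories - 1

# Compare with the 5% critical value for df = 5 (about 11.07):
reject_null = chi2 > 11.07
```

Here the statistic (4.2) falls well below the critical value, so the observed counts are consistent with a fair die at the 5% level.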
The document discusses research on learning to improve the efficiency of machine learning algorithms through speedup learning. It provides three key points:
1) Early work on explanation-based learning for speedup had limited success, but techniques like memoization and clause learning led to major improvements in SAT solvers.
2) More recent approaches use machine learning to build predictive models of problem instances and solver behavior, in order to inform strategies like automatic noise setting and randomized restart policies.
3) Case studies demonstrate these learning-based approaches can outperform traditional techniques and fixed policies by customizing resource allocation and reformulation based on problem structure and solver progress.
The document discusses research on learning to improve the efficiency of machine learning algorithms through speedup learning. It provides three key points:
1) Early work on explanation-based learning for speedup had limited success, but techniques like memoization and clause learning led to major improvements in SAT solvers.
2) More recent approaches use predictive models trained on dynamic features to learn optimal policies for controlling search algorithms, like setting noise levels or restart policies.
3) Open problems remain in developing optimal predictive policies with partial information and approximations, to continue improving search and reasoning performance.
Chapter 4
Summarizing Data Collected in the Sample
Learning Objectives (1 of 3)Distinguish between dichotomous, ordinal, categorical, and continuous variablesIdentify appropriate numerical and graphical summaries for each variable typeCompute a mean, median, standard deviation, quartiles and range for a continuous variable
Learning Objectives (2 of 3)Construct a frequency distribution table for dichotomous, categorical, and ordinal variablesProvide an example of when the mean is a better measure of location than the medianInterpret the standard deviation of a continuous variable
Learning Objectives (3 of 3)Generate and interpret a box plot for a continuous variableProduce and interpret side-by-side box plotsDifferentiate between a histogram and a bar chart
Variable TypesDichotomous variables have two possible responses (e.g., yes/no).Ordinal and categorical variables have more than two responses, and responses are ordered and unordered, respectively.Continuous (or measurement) variables assume in theory any values between a theoretical minimum and maximum.
BiostatisticsTwo areas of applied biostatisticsDescriptive statistics—summarize a sample selected from a population Inferential statistics—make inferences about population parameters based on sample statistics.
VocabularyData elements/data points Subjects/units of measurementPopulation versus sample
Sample vs. Population Any summary measure computed on a sample is a statistic.Any summary measure computed on a population is a parameter.
n = Sample Size
N = Population Size
Example 4.1.
Dichotomous Variable
Frequency Distribution Table
Relative Frequency Bar Chart for Dichotomous Variable
Sample: n = 50
Population: Patients at health center
Variable: Marital status
Categorical Outcome (1 of 2)Marital StatusNumber of PatientsMarried24Separated5Divorced8Widowed2Never married11Total50
Categorical Outcome (2 of 2)
Frequency Distribution Table Marital StatusNumber of
Patients (f)Relative Frequency
(f/n)Married240.48Separated50.10Divorced80.16Widowed20.04Never married110.22Total501.00
Frequency Bar Chart
Sample: n =50
Population: Patients at health center
Variable: Self-reported current health status
Ordinal Outcome (1 of 2)Health StatusNumber of PatientsExcellent19Very good12Good9Fair6Poor4Total50
Ordinal Outcome (2 of 2)
Frequency Distribution Table Heath StatusFreq.Rel. Freq.Cumulative Freq.Cumulative Rel. Freq.Excellent1938%1938%Very good1224%3162%Good918%4080%Fair612%4692%Poor48%50100%50100%
Relative Frequency Histogram
Example 4.2.
Ordinal Variable
Frequency Distribution Table
Relative Frequency Histogram
for Ordinal Variable
Assume, in theory, any value between a theoretical minimum and maximumQuantitative, measurement variables
Continuous Variable (1 of 9)
Population: Patients 50 years of age with coronary artery diseaseSample: n = 7 patientsOutcome: Systol ...
Strategies for Metabolomics Data AnalysisDmitry Grapov
Part of a lectures series for the international summer course in metabolomics 2013 (http://metabolomics.ucdavis.edu/courses-and-seminars/courses). Get more material and information here (http://imdevsoftware.wordpress.com/2013/09/08/sessions-in-metabolomics-2013/).
How to Build a Module in Odoo 17 Using the Scaffold MethodCeline George
Odoo provides an option for creating a module by using a single line command. By using this command the user can make a whole structure of a module. It is very easy for a beginner to make a module. There is no need to make each file manually. This slide will show how to create a module using the scaffold method.
How to Setup Warehouse & Location in Odoo 17 InventoryCeline George
In this slide, we'll explore how to set up warehouses and locations in Odoo 17 Inventory. This will help us manage our stock effectively, track inventory levels, and streamline warehouse operations.
A workshop hosted by the South African Journal of Science aimed at postgraduate students and early career researchers with little or no experience in writing and publishing journal articles.
This slide is special for master students (MIBS & MIFB) in UUM. Also useful for readers who are interested in the topic of contemporary Islamic banking.
This presentation was provided by Steph Pollock of The American Psychological Association’s Journals Program, and Damita Snow, of The American Society of Civil Engineers (ASCE), for the initial session of NISO's 2024 Training Series "DEIA in the Scholarly Landscape." Session One: 'Setting Expectations: a DEIA Primer,' was held June 6, 2024.
A review of the growth of the Israel Genealogy Research Association Database Collection for the last 12 months. Our collection is now passed the 3 million mark and still growing. See which archives have contributed the most. See the different types of records we have, and which years have had records added. You can also see what we have for the future.
This presentation includes basic of PCOS their pathology and treatment and also Ayurveda correlation of PCOS and Ayurvedic line of treatment mentioned in classics.
हिंदी वर्णमाला पीपीटी, hindi alphabet PPT presentation, hindi varnamala PPT, Hindi Varnamala pdf, हिंदी स्वर, हिंदी व्यंजन, sikhiye hindi varnmala, dr. mulla adam ali, hindi language and literature, hindi alphabet with drawing, hindi alphabet pdf, hindi varnamala for childrens, hindi language, hindi varnamala practice for kids, https://www.drmullaadamali.com
2. INTRODUCTION
Motivation
High variability in ADHD
Observed oscillatory component specific to ADHD (Castellanos et al., 2005)
Understanding the underlying process
RT variability » Introduction
Leth-Steensen et al. (2000) Acta Psych., 104, 167-190; Castellanos et al. (2005) Biol. Psychiatry, 57, 1416–1423
3. VARIABILITY RESULTS
Some results in the neuropsychology literature
A number of studies report increased intra-individual variability (IIV) in RT on cognitive tasks in ADHD
More extreme slow responses in ADHD (Leth-Steensen et al., 2000; Williams et al., 2007)
IIV a potential 'endophenotype' (Castellanos & Tannock, 2002)
RT variability » Introduction
Castellanos & Tannock (2002) Nat. Rev. Neurosc., 3, 617–628
4. VARIABILITY RESULTS
Some results in the neuropsychology literature
Autism associated with increased IIV (Geurts et al., 2004)
Details in findings differ between studies; different methods used (Geurts et al., 2008)
Linear relationship between RT mean and RT standard deviation not accounted for (Luce, 1986; Wagenmakers et al., 2005; 2006)
RT variability » Introduction
Luce (1986) Response Times; Wagenmakers et al. (2005) JMP, 49, 195-204; Wagenmakers & Brown (2007) Psych. Rev., 114, 830–841
10. VARIANCE ANALYSIS
Not ANOVA (just to make sure)
Assessment of between-group differences in intra-individual variability (IIV)
Looking at intra-individual standard deviations (ISDs)
RT variability » Intra-individual variance
Williams et al. (2005) Neuropsychologia, 19, 88–96; Williams et al. (2007) JCENP, 29, 277-289
11. Response time histograms
[Figure: RT histograms (density vs. RT), TD Controls vs. HFA]
Example intra-individual variances
RT histograms: HFA vs. TD controls
TD children more skewed
Clearly different
RT variability » Intra-individual variance » Example
VARIANCE ANALYSIS
12. Log transformed RT histograms
[Figure: log-RT histograms (density vs. log RT), TD Controls vs. HFA]
Example intra-individual variances
Different means
t-test (transformed and untransformed) highly significant
RT variability » Intra-individual variance » Example
Williams et al. (2005) Neuropsych., 19, 88–96
VARIANCE ANALYSIS
13. Log transformed VRT histograms
[Figure: log-RT histograms (density vs. log RT), TD Controls vs. HFA]
Example intra-individual variances
Airplane data
Different ISDs?
Rather obvious, confirmed by t-test (t = -4.1, df = 52.7, p < .001)
RT variability » Intra-individual variance » Example
VARIANCE ANALYSIS
14. VARIANCE ANALYSIS
What is the origin of larger variability?
May be in upper tail or lower tail
RT variability » Intra-individual variance
Williams et al. (2007) JCENP, 29, 277-289
15. VARIANCE ANALYSIS
What is the origin of larger variability?
May be in upper tail or lower tail
Williams et al. (2007) compare ISDs of the 25% highest RTs and of the 25% lowest RTs
HFA > TDC for upper 25% (p < .002, eff. size = .63)
HFA not different from TDC for lower 25% ISDs
RT variability » Intra-individual variance
Williams et al. (2007) JCENP, 29, 277-289
17. MODELING
Fit an informative density function to the RT histogram
a.k.a. histogram fitting
Parametric density functions proposed for RTs:
Wald, Log-Normal, Ex-Wald, Ex-Gauss, Gamma, Weibull, etc.
All long tailed, all associated with waiting times
RT variability » Histogram Modeling » Introduction
Luce (1986) Response Times; van Zandt et al. (2000) PB&R, 7, 424-465; Heathcote et al. (1991) Psych. Bull., 109, 340–347
18. MODELING
Ex-Gauss (McGill, 1963)
An Ex-Gauss r.v. is the sum of two independent random variables (its density is a convolution):
RT = D + T, D ~ N(μ, σ²), T ~ Exp(λ)
Parameters: μ, σ², λ (sometimes μ, σ², τ = 1/λ)
λ (or τ) measures the 'fatness' of the upper tail
RT variability » Histogram Modeling » Ex-Gauss
McGill (1963) Handbook of Math. Psych., Vol. 1, pp. 309-360; Heathcote et al. (1991)
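The convolution definition lends itself to direct simulation; a minimal sketch in Python/NumPy (the deck's own figures were made in R, and `rexgauss` is an illustrative helper name):

```python
import numpy as np

def rexgauss(n, mu, sigma, tau, rng):
    """Simulate ex-Gaussian RTs as RT = D + T with independent
    D ~ N(mu, sigma^2) and T ~ Exp(1/tau), i.e. E(T) = tau."""
    return rng.normal(mu, sigma, n) + rng.exponential(tau, n)

rng = np.random.default_rng(1)
# Parameter values of the simulated example on the next slide
rt = rexgauss(100_000, mu=304, sigma=34, tau=83, rng=rng)

# By independence: E(RT) = mu + tau = 387, Var(RT) = sigma^2 + tau^2 = 8045
print(rt.mean(), rt.var())
```

The sample mean and variance land close to the theoretical values, which is a quick sanity check on the convolution structure.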
19. Histogram of Ex-Gauss simulated RTs
[Figure: histogram of simulated RTs (density vs. RT)]
MODELING
Example simulated RTs with Ex-Gauss
μ = 304, σ = 34 (σ² = 1156), τ = 83
Often very good for RTs
RT variability » Histogram Modeling » Ex-Gauss » Example
20. MODELING
How to fit this to data? How to get estimates?
Method of Moments
Maximum Likelihood
Chi-square fitting, quantiles fitting, CDF fitting, 'Quantile Maximum Likelihood', Bayesian, and more
Which to choose?
RT variability » Histogram Modeling » Estimation
21. MODELING
Method of Moments
Based on sample moments mk = (1/n) Σi=1..n (xi − x̄)ᵏ
Assumes RTs of different trials are independent
Often easy to calculate
Notoriously unstable for k > 2; can result in negative estimates!
E.g., for Ex-Gauss the parameter estimates are
τ = (m3/2)^(1/3), μ = m1 − τ, σ² = m2 − τ² (with m1 the sample mean)
RT variability » Histogram Modeling » Estimation » Method of Moments
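As a sketch, the moment estimators above are a few lines of NumPy (`exgauss_moments` is an illustrative helper; assumes independent, outlier-free RTs):

```python
import numpy as np

def exgauss_moments(rt):
    """Method-of-moments estimates for the ex-Gaussian:
    tau = (m3/2)^(1/3), mu = m1 - tau, sigma^2 = m2 - tau^2,
    with m1 the sample mean and m2, m3 the central sample moments."""
    m1 = rt.mean()
    m2 = np.mean((rt - m1) ** 2)
    m3 = np.mean((rt - m1) ** 3)
    tau = (m3 / 2) ** (1 / 3)   # fails for left-skewed samples (negative m3)
    mu = m1 - tau
    sigma2 = m2 - tau ** 2      # can go negative: the instability noted above
    return mu, sigma2, tau

rng = np.random.default_rng(2)
rt = rng.normal(304, 34, 50_000) + rng.exponential(83, 50_000)
mu_hat, sigma2_hat, tau_hat = exgauss_moments(rt)
```

With a large clean sample the estimates recover the generating values; with small samples or outliers, the third moment makes them swing wildly.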
22. MODELING
Maximum likelihood
Based on likelihood of data under different parameter values
Assumes RTs from different trials are independent
Many optimal properties (e.g., most precise)
Often not so easy to determine (computer work!)
Can be unstable (outliers, landscape, numerically)
RT variability » Histogram Modeling » Estimation » Maximum Likelihood
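For maximum likelihood, SciPy happens to ship the ex-Gaussian as `scipy.stats.exponnorm` (parametrised by the shape K = τ/σ); a sketch of the 'computer work', with rough starting values to guard against the instabilities just mentioned:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
rt = rng.normal(304, 34, 5000) + rng.exponential(83, 5000)

# exponnorm is the ex-Gaussian with shape K = tau/sigma, loc = mu and
# scale = sigma; .fit() maximizes the likelihood numerically, so decent
# starting guesses (positional shape, keyword loc/scale) help convergence.
K_hat, mu_hat, sigma_hat = stats.exponnorm.fit(rt, 2.0, loc=300.0, scale=40.0)
tau_hat = K_hat * sigma_hat
```

The numerical optimisation is exactly the "not so easy to determine" part: it can wander off for bad starts or outlier-laden data.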
23. MODELING
Empirical quantiles based methods, e.g.:
Chi-square
QMLE, QMPE
More stable (outliers, numerically)
Inefficient (compared to ML)
Little statistical theory (can we rely on it?)
RT variability » Histogram Modeling » Estimation » Empirical quantiles based
Brown & Heathcote (2003) QMLE, BRMI&C,35, 485–492; Heathcote et al. (2004) QMPE, BRMI&C, 36, 277–290
24. MODELING
Fitting RT distributions can be done in 'programmable environments':
R/S-PLUS, Matlab, Excel, Calc, Gauss, WinBUGS, Maple, etc.
SPSS more problematic
But: Ex-Gauss never implemented (nor Wald, Ex-Wald, diffusion)
Special purpose programs made available on the internet:
DISFIT, qmpe, RTSYS, PASTIS
RT variability » Histogram Modeling » Estimation » Software
Dolan et al. (2002), Heathcote et al. (2003), Heathcote (1996), Cousineau et al. (1997)
25. Histogram of Squid data (1 subject)
[Figure: RT histogram with fitted density (density vs. RT)]
MODELING
Example fit to Squid task data
How do we know it's 'correct'?
RT variability » Histogram Modeling » Estimation » Example
26. MODELING
Assessment of fit (apart from match with histogram)
So-called 'chi-square' method sometimes in the Ψ literature (e.g., van Zandt et al., 2000)
Sometimes the Kolmogorov-Smirnov test is suggested
But: parameters are estimated
Bootstrap of, e.g., the KS statistic (Babu & Rao, 2004)
Babu & Rao (2004) Sankhyā, 66, 63–74
RT variability » Histogram Modeling » Estimation » Diagnostics
27. MODELING
Assumptions (independent, identically distributed) met?
Difficult to test for in general
Test based on E(XⁿYᵐ) = E(Xⁿ)E(Yᵐ) for independent X and Y
In particular E(XY) = E(X)E(Y), i.e., Cov(X, Y) = 0
Most likely violation: sequential correlations → autocorrelation
RT variability » Histogram Modeling » Estimation » Diagnostics
28. MODELING
[Figure: autocorrelation functions (ACF vs. lag) — Airplane data, subject 20088 (HFA group); Squid data, subject 40002 (group 1)]
Autocorrelation plot
Statistical test: (Box-Pierce), Ljung-Box
Airplane data: χ² = 8.76, df = 1, p = .003
Squid data: χ² = 1.02, df = 1, p = .31
Ljung & Box, Biometrika, 65, 297–303
RT variability » Histogram Modeling » Diagnostics » Example
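The Ljung-Box statistic behind such tests is simple to compute directly (a sketch; `ljung_box` is an illustrative helper, not a library call — statsmodels offers `acorr_ljungbox` for the same computation):

```python
import numpy as np
from scipy import stats

def ljung_box(y, lags=1):
    """Ljung-Box test for autocorrelation up to `lags`:
    Q = n(n+2) * sum_k r_k^2 / (n-k), with r_k the lag-k sample
    autocorrelation; Q ~ chi^2(lags) under H0 of no autocorrelation."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    yc = y - y.mean()
    denom = np.sum(yc ** 2)
    r = np.array([np.sum(yc[k:] * yc[:-k]) / denom for k in range(1, lags + 1)])
    q = n * (n + 2) * np.sum(r ** 2 / (n - np.arange(1, lags + 1)))
    p = stats.chi2.sf(q, df=lags)
    return q, p

# White noise should not be flagged; a strongly autocorrelated series should
rng = np.random.default_rng(4)
q, p = ljung_box(rng.normal(size=500), lags=1)
```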
29. METHODS DISCUSSED
1. Variance analysis (not ANOVA)
2. RT histograms: fitting models that parametrise distributional features (Ex-Gaussian in particular)
3. Sequence characteristics: trends, time series, autocorrelation & spectral analysis
4. Modeling cognitive processes
RT variability » Time series
30. TIME SERIES
Autocorrelation shows sequential order matters
RT sequential effects (e.g. post-error slowing)
design induced
spontaneous
Luce (1986) Response Times
RT variability » Time series
[Figure: Airplane task RT sequences (RT vs. trial, 60 trials) for HFA subjects 117–120 and TD Control subjects 56–59]
31. TIME SERIES
How to describe it?
All random, but group differences
Time series (and the art of analysis)
Brillinger (1975) Time Series: Data Analysis and Theory; Brockwell & Davis (1991) Time Series: Theory and Methods
RT variability » Time series
[Figure: Airplane task RT sequences (RT vs. trial, 60 trials), one HFA and one TDC individual overlaid]
32. TIME SERIES
Time series descriptives
Trends
Seasonal
Irregular fluctuations
Stationary time series
Non stationary time series
Brillinger (1975) Time series: Data analysis and theory, Brockwell & Davis (1991) Time series:Theory and methods
RT variability » Time series
34. TIME SERIES
Time series descriptives
Trends
Seasonal
Irregular fluctuations
Stationary time series
Non stationary time series (detrending, differencing)
Brillinger (1975) Time Series: Data Analysis and Theory; Brockwell & Davis (1991) Time Series: Theory and Methods
RT variability » Time series
35. TIME SERIES
(Weakly) Stationary time series:
E(Yt) is constant
E(YtYt−k) is a function of k only
Characterization: autocovariance function E(YtYt−k) − E(Yt)², or power spectral density
Brillinger (1975) Time Series: Data Analysis and Theory; Brockwell & Davis (1991) Time Series: Theory and Methods
RT variability » Time series » Autocovariance function
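A sketch of the sample autocovariance and autocorrelation functions (`acovf` is an illustrative helper):

```python
import numpy as np

def acovf(y, max_lag):
    """Sample autocovariance gamma(k) = (1/n) sum_t (y_t - ybar)(y_{t+k} - ybar),
    the usual biased estimator (this is also what R's acf() computes)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    yc = y - y.mean()
    return np.array([np.sum(yc[: n - k] * yc[k:]) / n for k in range(max_lag + 1)])

rng = np.random.default_rng(5)
y = rng.normal(size=1000)
gamma = acovf(y, 10)
rho = gamma / gamma[0]   # autocorrelation function; rho[0] = 1 by construction
```

For white noise the estimated autocorrelations beyond lag 0 hover near zero, which is the baseline the ACF plots above are judged against.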
36. TIME SERIES
Autocorrelation functions, Airplane data
Differs from one individual to another
Chatfield (1996) The Analysis of Time Series
RT variability » Time series » Autocovariance function » Example
[Figure: ACF vs. lag for subjects 117–120 (HFA) and 56–59 (TDC)]
37. TIME SERIES
Autocorrelation functions, Airplane data
Mean autocorrelation functions: no obvious difference
Geurts et al. (2008)
[Figure: mean ACF vs. lag, HFA group and TDC group]
RT variability » Time series » Autocovariance function » Example
38. TIME SERIES
How to interpret this correlation pattern?
This case 'easy': large RT on one trial predicts large RT on the next (but R² < 0.03)
Alternative: oscillatory components
Chatfield (1996)
[Figure: mean ACF vs. lag, HFA group and TDC group]
RT variability » Time series » Autocovariance function » Example
39. METHODS DISCUSSED
1. Variance analysis (not ANOVA)
2. RT histograms: fitting models that parametrise distributional features (Ex-Gaussian in particular)
3. Sequence characteristics: trends, time series, autocorrelation & spectral analysis
4. Modeling cognitive processes
RT variability » Time series
40. SPECTRAL ANALYSIS
Heuristic: 'regression' of yt, t = 0, …, T−1, on cosines and sines
yt = Σk αk cos(kωT t) + Σk βk sin(kωT t), k = 1…K
ωT chosen so that α̂k and α̂m will be independent if k ≠ m
Brillinger (1975) Time Series: Data Analysis and Theory; Chatfield (1996)
RT variability » Time series » Spectral analysis
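This sine/cosine 'regression' is exactly what the discrete Fourier transform computes; a sketch of a raw periodogram on simulated data (`periodogram` is an illustrative helper):

```python
import numpy as np

def periodogram(y):
    """Raw periodogram I(f_k) = |DFT(y - ybar)|^2 / n at the Fourier
    frequencies f_k = k/n, k = 1..n/2 -- the squared amplitudes of the
    sine/cosine 'regression', up to scaling."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    dft = np.fft.rfft(y - y.mean())
    freqs = np.fft.rfftfreq(n)          # 0, 1/n, ..., 1/2 (cycles per trial)
    return freqs[1:], np.abs(dft[1:]) ** 2 / n

rng = np.random.default_rng(6)
t = np.arange(60)                       # a 60-trial sequence, as in the examples
y = np.sin(2 * np.pi * 0.2 * t) + 0.3 * rng.normal(size=60)
f, pgram = periodogram(y)
print(f[np.argmax(pgram)])              # → 0.2, the injected frequency
```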
41. SPECTRAL ANALYSIS
Not 'regression' in the true sense if K = T/2:
the estimated αk, βk, k = 1…K are 2K = T coefficients
a transformation! (DFT)
We always take K = T/2
Chatfield (1996) The Analysis of Time Series
RT variability » Time series » Spectral analysis
42. SPECTRAL ANALYSIS
Assumptions:
samples regularly spaced
highest frequency < ½ × sampling frequency (Nyquist)
orthogonal to time
(the first can be relaxed)
Chatfield (1996) The Analysis of Time Series
RT variability » Time series » Spectral analysis
53. SPECTRAL ANALYSIS
Solution is some form of averaging of the Rk²'s of consecutive cycles (i.e. periodogram smoothing), at the expense of bias
Further problem: leakage due to finite T (alleviated by tapering)
Brillinger (1975), Chatfield (1996)
RT variability » Time series » Spectral analysis
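A sketch of such consecutive-cycle averaging, using a flat Daniell-style window (`smooth_periodogram` is an illustrative helper name):

```python
import numpy as np

def smooth_periodogram(pgram, span=7):
    """Flat (Daniell-style) smoother: replace each periodogram ordinate by
    the mean of `span` consecutive ordinates. Variance of the spectral
    estimate drops roughly by 1/span, at the expense of bias (peaks flatten)."""
    kernel = np.ones(span) / span
    # mode="same" keeps the length; the few edge ordinates are biased low
    # because they have fewer neighbours to average over
    return np.convolve(pgram, kernel, mode="same")

raw = np.ones(40)            # a flat spectrum should stay ~flat after smoothing
sm = smooth_periodogram(raw, span=7)
```

Larger spans trade more variance reduction for more bias, which is precisely the trade-off the following slides illustrate with 7-, 15- and 31-cycle averages.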
[Figure: periodogram ordinates Rk² vs. cycles 1–10]
54. Smoothed periodogram for HFA and TDC individuals
7 consecutive cycle average (bandwidth = 0.0127)
Chatfield (1996)
[Figure: smoothed periodograms (spectrum vs. frequency) for subjects 117–120 (HFA) and 56–59 (TDC)]
RT variability » Time series » Spectral analysis » Example
SPECTRAL ANALYSIS
55. Smoothed periodogram for HFA and TDC individuals
15 consecutive cycle average (bandwidth = 0.0300)
Chatfield (1996)
[Figure: smoothed periodograms (spectrum vs. frequency) for subjects 117–120 (HFA) and 56–59 (TDC)]
RT variability » Time series » Spectral analysis » Example
SPECTRAL ANALYSIS
56. Smoothed periodogram for HFA and TDC individuals
31 consecutive cycle average (bandwidth = 0.0679)
Gross features change → perhaps too much smoothing
Chatfield (1996)
[Figure: smoothed periodograms (spectrum vs. frequency) for subjects 117–120 (HFA) and 56–59 (TDC)]
RT variability » Time series » Spectral analysis » Example
SPECTRAL ANALYSIS
57. Group inference: average smoothed periodograms
Assumption: within groups the spectra are the same (can be relaxed)
Chatfield (1996)
[Figure: mean spectrum (spectrum vs. frequency), HFA and TD Control groups]
RT variability » Time series » Spectral analysis » Example
SPECTRAL ANALYSIS
58. SPECTRAL ANALYSIS
What about window carpentry?
Can help to reduce 'spectral leakage'; is an art
However, most software uses cosine bell 10% tapers
What about 'zero padding'?
Only done for computational speed; suggests increased spectral resolution (which is not the case!)
Sometimes remarkable preprocessing (escapes me)
Johnson et al. (2007); Castellanos et al. (2005)
RT variability » Time series » Spectral analysis
59. SPECTRAL ANALYSIS
[Figure: smoothed periodogram of the original RT sequence (60 trials), bandwidth = 0.0129, vs. smoothed periodogram of the sequence zero padded to length 256, bandwidth = 0.00298]
Press et al. (2002) Numerical Recipes
RT variability » Time series » Spectral analysis » Example
60. SPECTRAL ANALYSIS
But ... were all assumptions satisfied?
RT sequences stationary? Only by eyeballing (Dickey-Fuller test)
Samples regularly spaced? Airplane data: yes
Sampling frequency > 2x highest frequency? Eh...?
RT measurements orthogonal to time? Nope...
Brillinger (1975)
RT variability » Time series » Spectral analysis
61. SPECTRAL ANALYSIS
Non-stationary RT sequences? detrending, differencing, STFT, wavelets, Haar-transform (?)
Irregularly spaced samples? adapted DFT, splines, Haar-transform (?)
Sampling frequency < 2x highest frequency? Spectral folding may be a minor problem; focus on trend, Haar (?)
RTs not orthogonal to time? point process, Haar-transform (?)
Koopman (1995); Torrence & Compo (1998) Bull. Am. Meteor. Soc., 79, 61–78
RT variability » Time series » Spectral analysis
62. METHODS DISCUSSED
1. Variance analysis (not ANOVA)
2. RT histograms: fitting models that parametrise distributional features (Ex-Gaussian in particular)
3. Sequence characteristics: trends, time series, autocorrelation & spectral analysis
4. Modeling cognitive processes
RT variability » Time series
63. TIME SERIES MODELS
General linear stationary time series model:
yt = zt + θ1 zt−1 + θ2 zt−2 + θ3 zt−3 + θ4 zt−4 + ⋅⋅⋅
zt, zt−1, zt−2, ..., independent and identically distributed with variance σ²
That is, the series is "filtered noise"
Very broad class of models, and in most useful cases equivalently expressed in a finite number of parameters
Box & Jenkins (1970) Time Series: Forecasting and Control; Brockwell & Davis (1991) Time Series: Theory and Methods
RT variability » Time series » Modeling
64. TIME SERIES MODELS
Moving average MA(q):
yt = zt + θ1 zt−1 + θ2 zt−2 + ⋅⋅⋅ + θq zt−q = zt + Σi θi zt−i
Autoregressive AR(p):
yt = ϕ1 yt−1 + ϕ2 yt−2 + ⋅⋅⋅ + ϕp yt−p + zt = Σi ϕi yt−i + zt
ARMA(p, q):
yt − Σi ϕi yt−i = zt + Σi θi zt−i
Box & Jenkins (1970) Time Series: Forecasting and Control; Hamaker, Dolan & Molenaar, MBR, 40, 207–233
RT variability » Time series » Modeling
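These definitions are easy to simulate; a sketch reproducing the θ1 = ϕ1 = 0.7 examples of the following slides and checking the sample ACFs against theory (MA(1): ρ1 = θ/(1+θ²) ≈ 0.47, zero beyond lag 1; AR(1): ρk = ϕᵏ, geometric decay):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 200_000
z = rng.normal(size=n + 1)

# MA(1): y_t = z_t + 0.7 z_{t-1};  AR(1): y_t = 0.7 y_{t-1} + z_t
y_ma = z[1:] + 0.7 * z[:-1]
y_ar = np.empty(n)
y_ar[0] = z[1]
for t in range(1, n):
    y_ar[t] = 0.7 * y_ar[t - 1] + z[t + 1]

def acf(y, k):
    yc = y - y.mean()
    return np.sum(yc[:-k] * yc[k:]) / np.sum(yc ** 2)

# MA(1): rho_1 = 0.7/1.49 ~ 0.47, rho_2 ~ 0;  AR(1): rho_1 = 0.7, rho_2 = 0.49
print(acf(y_ma, 1), acf(y_ma, 2))
print(acf(y_ar, 1), acf(y_ar, 2))
```

The sharp cut-off of the MA ACF versus the geometric decay of the AR ACF is exactly what the plots on the next slide show.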
65. TIME SERIES MODELS
Chatfield (1996)
[Figure: simulated MA(1), θ1 = 0.7, and AR(1), ϕ1 = 0.7, series (yt vs. time, 100 points) with their ACFs (ACF vs. lag)]
RT variability » Time series » Modeling » Example
66. TIME SERIES MODELS
Chatfield (1996)
[Figure: simulated MA(1), θ1 = 0.7, and AR(1), ϕ1 = 0.7, series (yt vs. time) with their smoothed periodograms and theoretical spectra (bandwidth = 0.0180)]
RT variability » Time series » Modeling » Example
67. TIME SERIES MODELS
Autoregressive AR(p) can be cast into MA(∞) form:
yt = ϕ yt−1 + zt = ϕ (ϕ yt−2 + zt−1) + zt
= ϕ² yt−2 + ϕ zt−1 + zt = etc.
= zt + ϕ zt−1 + ϕ² zt−2 + ϕ³ zt−3 + ⋅⋅⋅
Requires convergence for stationarity, i.e., |ϕ| < 1
More generally, the (complex) solutions for B of
1 − ϕ1 B − ϕ2 B² − ⋅⋅⋅ − ϕp Bᵖ = 0 must satisfy |B| > 1
Box & Jenkins (1970), Brockwell & Davis (1991), Chatfield (1995)
RT variability » Time series » Modeling
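The MA(∞) expansion can be verified numerically: truncating the sum yt = Σj ϕʲ zt−j reproduces the AR(1) recursion to machine precision (a sketch):

```python
import numpy as np

phi = 0.7
rng = np.random.default_rng(9)
z = rng.normal(size=500)

# AR(1) recursion, started so that y_t = sum_{j=0..t} phi^j z_{t-j} exactly
y = np.empty(500)
y[0] = z[0]
for t in range(1, 500):
    y[t] = phi * y[t - 1] + z[t]

# Truncated MA(infinity) form: with |phi| < 1 the weights phi^j die out
# geometrically, so truncating at J = 60 terms suffices (0.7^60 ~ 5e-10)
J = 60
w = phi ** np.arange(J)
y_ma = np.empty(500)
for t in range(500):
    m = min(J, t + 1)
    y_ma[t] = np.sum(w[:m] * z[t::-1][:m])

print(np.max(np.abs(y - y_ma)))   # essentially zero: the two forms agree
```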
68. TIME SERIES MODELS
While not strictly necessary, for technical reasons MA(q) coefficients are usually restricted such that the model can be cast as an AR(∞) (the MA is then called invertible)
An ARMA(p, q) satisfying these conditions is stationary and can be cast into AR(∞) or MA(∞) form
Therefore ARMA(p, q) suits most observed stationary time series
Box & Jenkins (1970), Brockwell & Davis (1991), Chatfield (1995)
RT variability » Time series » Modeling
69. TIME SERIES MODELS
Chatfield (1996)
[Figure: simulated ARMA(1, 1), ϕ1 = 0.7, θ1 = 0.7, series (yt vs. time, 100 points) with its ACF and smoothed periodogram (bandwidth = 0.00764)]
RT variability » Time series » Modeling » Example
70. TIME SERIES MODELS
How to fit an ARMA(p, q)?
Fitting time series models is an art, but necessary steps:
choose model (i.e., specify p and q)
fit using least squares or ML (optimization with or without Kalman filter); many programs: R, SPSS, etc.
evaluate model fit
Box & Jenkins (1970), Brockwell & Davis (1991), Chatfield (1995)
RT variability » Time series » Modeling
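For a pure AR(p), the least-squares step is just a regression of yt on its own lags; a sketch (`fit_ar` is an illustrative helper), simulated with the AR(2) coefficient values reported for the HFA individual a few slides on:

```python
import numpy as np

def fit_ar(y, p):
    """Least-squares AR(p) fit: regress y_t on (y_{t-1}, ..., y_{t-p}).
    (R's ar()/arima() offer Yule-Walker and ML variants of the same idea.)"""
    y = np.asarray(y, dtype=float)
    y = y - y.mean()
    Y = y[p:]
    X = np.column_stack([y[p - j : len(y) - j] for j in range(1, p + 1)])
    phi, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return phi

# Simulate an AR(2) with phi1 = 0.03, phi2 = -0.33 (a stationary pair:
# the roots of 1 - 0.03 B + 0.33 B^2 lie outside the unit circle)
rng = np.random.default_rng(10)
n = 50_000
y = np.zeros(n)
e = rng.normal(size=n)
for t in range(2, n):
    y[t] = 0.03 * y[t - 1] - 0.33 * y[t - 2] + e[t]

phi1_hat, phi2_hat = fit_ar(y, 2)
```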
71. TIME SERIES MODELS
[Figure: RT sequence, HFA individual (RT vs. trial, 60 trials)]
Choosing a model (i.e., p and q)
For q some guidance from the ACF
For p some guidance from the partial autocorrelation
Box & Jenkins (1970), Brockwell & Davis (1991), Chatfield (1995)
RT variability » Time series » Modeling » Example
72. TIME SERIES MODELS
[Figure: ACF vs. lag, HFA individual]
Choosing a model (i.e., p and q)
For q some guidance from the ACF → 0, 1, 2?
For p some guidance from the partial autocorrelation
Box & Jenkins (1970), Brockwell & Davis (1991), Chatfield (1995)
RT variability » Time series » Modeling » Example
73. TIME SERIES MODELS
[Figure: partial ACF vs. lag, HFA individual]
Choosing a model (i.e., p and q)
For q some guidance from the ACF → 0, 1?
For p some guidance from the partial autocorrelation → 2
Box & Jenkins (1970), Brockwell & Davis (1991), Chatfield (1995)
RT variability » Time series » Modeling » Example
74. TIME SERIES MODELS
Fit AR(2)
AR(2) fit: ϕ1 = 0.03, ϕ2 = -0.33
[Figure: theoretical and smoothed periodogram (spectrum vs. frequency, bandwidth = 0.0300), HFA individual]
Box & Jenkins (1970), Brockwell & Davis (1991), Chatfield (1995)
RT variability » Time series » Modeling » Example
75. TIME SERIES MODELS
Evaluate model fit
Analyse residuals: standardized residuals, ACF of residuals, Ljung-Box test
[Figure: standardized residuals vs. time, ACF of residuals, and p-values for the Ljung-Box statistic by lag]
Box & Jenkins (1970), Brockwell & Davis (1991), Chatfield (1995)
RT variability » Time series » Modeling » Example
76. TIME SERIES MODELS
Modern approach: use fit indices (AIC, BIC, etc.) to select the model
AIC yields ARMA(2, 1) for the differenced series:
ϕ1 = 0.02, ϕ2 = -0.41, θ1 = -0.91
Due to slight trend?
[Figure: spectrum vs. frequency (bandwidth = 0.0300), HFA individual]
Brockwell & Davis (1991), Chatfield (1995)
RT variability » Time series » Modeling » Example
77. TIME SERIES MODELS
Most fundamental: ARMA(p, q)
Generalizations: ARIMA(p, d, q), ARFIMA, NARMA, etc.
Others: (G)ARCH & family, TAR, MS, point processes
Warning: these models were developed for forecasting, not necessarily for interpretation (see however Hamaker & Dolan)
Box & Jenkins (1970), Brockwell & Davis (1991), Chatfield (1996); Hamaker & Dolan (in press)
RT variability » Time series » Modeling
78. TIME SERIES MODELS
Assumptions in time series models are basically the same as in spectral analysis:
stationarity
regular sample spacing (can be relaxed, but difficult)
orthogonal to time axis (point process more natural)
RT variability » Time series » Modeling
79. TIME SERIES MODELS
Both spectral analysis and time series models assume measurements orthogonal to the time axis
RTs are not; they are parallel. What can be done?
Come up with a model for how RTs are generated
One possibility: postulate a latent process ξt that determines the RTs:
RTt ~ fRT(ξt), e.g., E(RTt) = ξt; requires analysis
RT variability » Time series
81. PROCESS UNDERLYING RT
Can we understand how RT distributions come
about?
Mathematical psychology: modeling information
processing in the brain
Models should predict RT characteristics (moments,
histograms, serial correlation, etc., and correctness)
RT variability » Process modeling
82. PROCESS UNDERLYING RT
Ex-Gauss is purely descriptive
Most important models:
Random walk models (discrete time)
Diffusion models (continuous time)
Race models
Luce (1986) Response times
RT variability » Process modeling
83. PROCESS UNDERLYING RT
Ratcliff’s diffusion model
‘Special purpose model’ for 2AFC tasks
Smith (1960), Ratcliff (1978)
RT variability » Process modeling » Diffusion models
84. Two-alternative response time tasks are usually
analyzed by considering mean response time (MRT)
and %error separately
Speed-accuracy trade-off
PROCESS UNDERLYING RT
RT variability » Process modeling » Diffusion models
85.–95. Diffusion model
[Figure: evidence X(t) accumulating as a continuous random walk between boundaries 0 and a, from starting point z; time axis 0–1700 msec. Labels: Boundary (a), Starting point (z), Drift, Non-decision time (τer); reaching the upper boundary produces a correct response, the lower boundary an error response]
Information accumulation as continuous random walk
Trial-to-trial parameter variability:
drift := ν + N(0, η²)
starting point := z + Unif(−ρz, ρz)
non-decision time := τer + Unif(0, ρτ)
RT = Exit Time + τer
Ratcliff (1978) Psych Rev
PROCESS UNDERLYING RT
RT variability » Process modeling » Diffusion models
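A minimal simulation of the accumulation process (Euler discretization of the Wiener process; parameter values are arbitrary, and trial-to-trial parameter variability is omitted for brevity):

```python
import numpy as np

def diffusion_trial(v=0.5, a=1.0, z=0.5, tau_er=0.3, s=1.0, dt=0.005, rng=None):
    """One trial: accumulate evidence X(t) from z*a until it exits (0, a).
    Returns (RT in seconds, hit upper boundary?)."""
    rng = rng if rng is not None else np.random.default_rng()
    x = z * a                      # starting point (z relative to boundary separation)
    t = 0.0
    step_sd = s * np.sqrt(dt)
    while 0.0 < x < a:
        x += v * dt + step_sd * rng.normal()   # drift + diffusion increment
        t += dt
    return t + tau_er, x >= a      # RT = exit time + non-decision time

rng = np.random.default_rng(7)
trials = [diffusion_trial(rng=rng) for _ in range(1000)]
rts = np.array([t for t, _ in trials])
acc = np.mean([c for _, c in trials])
```

With a positive drift the upper ("correct") boundary is hit more often than the lower one, and the exit-time distribution already has the right-skewed shape characteristic of empirical RT histograms.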
99. Parameter estimation by
Maximum Likelihood
Weighted Least Squares
Chi-square estimation
Difficulties with estimation
Long computation times
Evaluation of density involves numerical integration
PROCESS UNDERLYING RT
RT variability » Process modeling » Diffusion models
100. EZ-DIFFUSION ESSENTIALS
Chi-square: fast-dm (Voss et al.)
ML, QML: DMAT (MATLAB; Vandekerckhove et al.)
Method-of-moments estimators for simplified models:
EZ diffusion model estimates (Wagenmakers et al.)
EZ2 diffusion model estimates (Grasman et al.)
Vandekerckhove et al. (2007) BRM; Voss et al. (2007) BRM; Wagenmakers et al. (2007) PB&R, 14, 3-22; Grasman (2008) JMP, accepted
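The EZ route can be sketched directly: the closed-form expressions of Wagenmakers et al. (2007) recover drift rate, boundary separation, and non-decision time from just the proportion correct, the RT variance, and the mean RT (s = 0.1 is the conventional scaling constant; the input numbers below are illustrative):

```python
import numpy as np

def ez_diffusion(pc, vrt, mrt, s=0.1):
    """EZ-diffusion estimates (drift v, boundary a, non-decision Ter)
    from proportion correct pc, RT variance vrt, and mean RT mrt (seconds)."""
    L = np.log(pc / (1.0 - pc))                      # logit of accuracy
    x = L * (L * pc**2 - L * pc + pc - 0.5) / vrt
    v = np.sign(pc - 0.5) * s * x**0.25              # drift rate
    a = s**2 * L / v                                 # boundary separation
    y = -v * a / s**2
    mdt = (a / (2.0 * v)) * (1.0 - np.exp(y)) / (1.0 + np.exp(y))
    return v, a, mrt - mdt                           # Ter = MRT - mean decision time

v, a, ter = ez_diffusion(pc=0.80, vrt=0.11, mrt=0.72)
```

No iterative fitting is needed, which is why the EZ estimators are fast enough to apply per condition or per participant as a first-pass analysis.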