RidgeRace is a method for ancestral reconstruction of continuous traits that applies ridge regression to a phylogenetic tree. It estimates ancestral trait values without assuming a specific model of evolution and allows evolutionary rates to vary across branches. In simulations, RidgeRace performed comparably to existing methods such as maximum likelihood while being simpler. It was also applied to a hierarchical clustering of ovarian cancer samples, although this choice of real data is debatable, and the authors recommend the method only for approximately balanced trees.
Presentation for BIOF 501 at the University of British Columbia.
1. RidgeRace: ridge regression for continuous ancestral character estimation on phylogenetic trees
Christina Kratsch and Alice C. McHardy, Department for Algorithmic Bioinformatics, Heinrich Heine University
Presentation by Rosemary McCloskey
November 6, 2014
Kratsch & McHardy RidgeRace November 6, 2014 1 / 13
2. Ancestral reconstruction
[figure: a phylogeny with its ancestral nodes marked "?"]
phylogeny: binary tree representing evolutionary relationships between organisms
- leaves = observed/sampled taxa
- internal nodes = common ancestors
ancestral reconstruction: estimation of characteristics of unseen ancestral taxa
- discrete (e.g. DNA sequence)
- continuous (e.g. body weight)
http://topicpages.ploscompbiol.org/wiki/Ancestral reconstruction
3. RidgeRace
Existing ancestral reconstruction algorithms:
- assume traits evolve along the tree according to a particular model (e.g. Brownian motion)
- assume fixed rates of evolution across some or all branches
- use ancestral reconstruction only as a stepping stone to examine correlated traits
RidgeRace:
- uses phylogenetic information only (no evolutionary model)
- allows any rate on any branch
- has ancestral reconstruction as its goal
4. Methods
Observed phenotypes are sums of the contributions of each ancestral branch, plus the root:
y4 = g0 + ga + gb + gc
Branch contributions are proportional to branch lengths:
ga = la βa (where βa is the rate on branch a)
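The decomposition on this slide can be sketched in a few lines of Python (a toy illustration, not the authors' code; the tree path, branch lengths, and rates are invented example values):

```python
# Toy illustration of y4 = g0 + ga + gb + gc with ga = la * beta_a.
paths = {"y4": ["a", "b", "c"]}            # branches on the root-to-leaf path
length = {"a": 1.0, "b": 0.5, "c": 2.0}    # branch lengths l_j (invented)
beta = {"a": 0.3, "b": -0.1, "c": 0.2}     # per-branch rates (unknown in practice)
g0 = 10.0                                  # root phenotype

def phenotype(leaf):
    """Leaf phenotype = root value plus l_j * beta_j summed over its root path."""
    return g0 + sum(length[j] * beta[j] for j in paths[leaf])

print(phenotype("y4"))  # 10 + 0.3 - 0.05 + 0.4 = 10.65
```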
6. Ridge regression
[main equation garbled in extraction; the branch rates are estimated by ridge regression, i.e. β̂ = argmin_β ||y − L′β||² + λ||β||²],
where
L′_ij = l_j if j → i (branch j lies on the root-to-leaf-i path), and 0 otherwise.
Kratsch & McHardy RidgeRace November 6, 2014 6 / 13
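The regression can be sketched in NumPy (a minimal illustration with an invented four-branch tree, not the authors' implementation; for simplicity the root value g0 is taken as zero and dropped from y):

```python
import numpy as np

# Design matrix L': rows are leaves, columns are branches;
# L'[i, j] = l_j if branch j lies on the root-to-leaf-i path, else 0.
lengths = np.array([1.0, 0.5, 2.0, 1.5])   # branches a, b, c, d (toy values)
paths = [[0, 1, 2], [0, 1, 3]]             # leaf 0 passes a,b,c; leaf 1 passes a,b,d
Lp = np.zeros((2, 4))
for i, p in enumerate(paths):
    Lp[i, p] = lengths[p]

y = np.array([0.65, -0.20])                # observed leaf phenotypes (root subtracted)
lam = 0.1                                  # ridge penalty lambda

# Closed-form ridge solution: beta_hat = (L'^T L' + lam I)^{-1} L'^T y
beta_hat = np.linalg.solve(Lp.T @ Lp + lam * np.eye(4), Lp.T @ y)

# Ancestral estimate at the node below branches a and b: partial sum of contributions
ancestor = float(lengths[:2] @ beta_hat[:2])
print(beta_hat, ancestor)
```

The ridge penalty keeps the underdetermined system (here 4 rates from 2 leaves) solvable by shrinking the rates toward zero.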
7. Simulations
- random trees of size 30, 100, 200, 300, 400, 500
- phenotypic evolution by Brownian motion with σ² ∈ {0.5, 1, ..., 5}
- ancestral reconstruction with generalized least squares (GLS), maximum likelihood (ML), and RidgeRace
RidgeRace comparable to other methods.
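The Brownian-motion step of this setup can be sketched as follows (a toy version with a hand-written three-leaf tree and fixed seed; the paper's trees are random and much larger):

```python
import random

random.seed(1)
sigma2 = 1.0  # Brownian-motion variance per unit branch length

# Tree as nested tuples: (branch_length, child, child) for internal nodes,
# (branch_length,) for leaves; the root branch has length 0.
tree = (0.0, (1.0,), (0.5, (2.0,), (1.5,)))

def evolve(node, value, out):
    """Evolve a trait down the tree: each branch adds a N(0, sigma2 * l) increment."""
    value += random.gauss(0.0, (sigma2 * node[0]) ** 0.5)
    if len(node) == 1:
        out.append(value)          # leaf: record the simulated phenotype
    else:
        for child in node[1:]:
            evolve(child, value, out)

leaves = []
evolve(tree, 10.0, leaves)         # root phenotype 10.0
print(leaves)                      # three simulated leaf values
```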
9. Ovarian cancer data
Hierarchical clustering of 325 ovarian cancer samples.
Reconstructed survival time; mapped mutations to ancestral nodes by parsimony.
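As a rough sketch of how such a clustering yields the tree that then plays the role of the phylogeny (a hand-rolled average-linkage loop on toy data; the paper presumably used a standard clustering package, and 6 samples stand in for the 325):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))        # 6 toy samples with 4 features each

# Agglomerative average-linkage clustering: start with singleton clusters and
# repeatedly merge the pair with the smallest mean pairwise distance.
clusters = [(i, [i]) for i in range(len(X))]   # (subtree, member indices)
while len(clusters) > 1:
    best = None
    for a in range(len(clusters)):
        for b in range(a + 1, len(clusters)):
            d = np.mean([np.linalg.norm(X[i] - X[j])
                         for i in clusters[a][1] for j in clusters[b][1]])
            if best is None or d < best[0]:
                best = (d, a, b)
    _, a, b = best
    merged = ((clusters[a][0], clusters[b][0]), clusters[a][1] + clusters[b][1])
    clusters = [c for k, c in enumerate(clusters) if k not in (a, b)] + [merged]

tree = clusters[0][0]   # nested tuples: a binary tree over the samples
print(tree)
```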
10. Good points
The good:
- simple approach comparable in performance to more complex methods
- ancestral reconstruction without assuming a particular model of evolution
11. Room for improvement
- choice of real data was a bit odd (not ancestral reconstruction)
- stated limitation is very limiting:
"The estimation of … might thus be biased if the depth of single leaf nodes is large compared with the rest of the tree. We therefore recommend RidgeRace for approximately balanced trees."
Bush, Robin M., et al. "Effects of passage history and sampling bias on phylogenetic reconstruction of human influenza A evolution." PNAS 97.13 (2000): 6974-6980.
13. Brownian motion
[figure: average body mass (e.g. 15 kg to 48 kg) evolving as a random walk over time]
At each time step Δt, movement is drawn from a normal distribution with mean 0 and variance σ²Δt, then let Δt → 0.
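This limiting construction can be checked numerically (an illustrative sketch; the seed, step size, and sample count are arbitrary): at time t the walk's value should have variance σ²t.

```python
import random

random.seed(0)
sigma2, t, dt = 2.0, 5.0, 0.01
steps = int(t / dt)

def walk():
    """Random walk with N(0, sigma2 * dt) increments, approximating Brownian motion."""
    x = 0.0
    for _ in range(steps):
        x += random.gauss(0.0, (sigma2 * dt) ** 0.5)
    return x

xs = [walk() for _ in range(2000)]
var = sum(x * x for x in xs) / len(xs)   # mean is 0, so this estimates Var[X(t)]
print(var)                               # should be close to sigma2 * t = 10
```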