This document summarizes a study that evaluated eight methods for detecting selection along environmental gradients in populations. Using simulations, the study tested the effectiveness of methods based on population differentiation (FST) and on environmental correlation, in outbreeding and selfing populations, under varying levels of selection strength and sampling effort. Methods that use environmental data proved more powerful at detecting selection, even when selection was weak, and sampling more populations was found to be the most efficient strategy. The document concludes that correlation-based methods are better able to detect selection but require good environmental data, and discusses ongoing applications in Medicago truncatula and rice.
THEME – 4 Detecting Selection Along Environmental Gradients: Analysis of Eight Methods and Their Effectiveness for Outbreeding and Selfing Populations
1. Detecting Selection Along Environmental Gradients: Analysis of Eight Methods and Their Effectiveness for Outbreeding and Selfing Populations
Yves Vigouroux
Institut de Recherche pour le Développement
Montpellier, France
International Workshop on "Applied Mathematics and Omics Technologies for Discovering Biodiversity and Genetic Resources for Climate Change Mitigation and Adaptation to Sustain Agriculture in Drylands"
Rabat, Morocco, 24-27 June 2014
4. Principle of the differentiation selection scan
Environment: spatial variation
Neutral allele: variation due to demographic / gene flow / history effects
Selected gene: variation due to demographic / gene flow / history effects, plus selection
Differentiation: FST
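The differentiation scan described above can be sketched as a toy computation: estimate a per-locus FST across populations and flag loci whose value is extreme relative to the bulk of (presumed neutral) loci. This is an illustrative sketch only, not any of the specific methods in the deck; the simple variance-based FST estimator and the 95th-percentile cutoff are simplifying assumptions.

```python
import random

def fst(freqs):
    """Simple multi-population FST for a biallelic locus: variance of
    allele frequencies divided by p_bar * (1 - p_bar)."""
    p_bar = sum(freqs) / len(freqs)
    if p_bar in (0.0, 1.0):
        return 0.0
    var = sum((p - p_bar) ** 2 for p in freqs) / len(freqs)
    return var / (p_bar * (1.0 - p_bar))

def outlier_scan(loci_freqs, quantile=0.95):
    """Flag loci whose FST exceeds the given quantile of the empirical
    FST distribution (a crude stand-in for a neutral envelope)."""
    values = [fst(f) for f in loci_freqs]
    cutoff = sorted(values)[int(quantile * (len(values) - 1))]
    return [i for i, v in enumerate(values) if v > cutoff]

random.seed(1)
# 99 "neutral" loci: similar frequencies in all 10 populations.
neutral = [[0.5 + random.uniform(-0.05, 0.05) for _ in range(10)]
           for _ in range(99)]
# 1 "selected" locus: frequency tracks a gradient across populations.
selected = [[i / 9 for i in range(10)]]
hits = outlier_scan(neutral + selected)
print(hits)  # the gradient locus (index 99) should be among the outliers
```

Real differentiation methods replace the empirical cutoff with a neutral expectation derived from a demographic model, which is what makes them conservative when that model is misspecified.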
5. Environmental Gradients
• Methods: use environmental data or not?
• Sampling design?
• Impact of the reproduction system (selfing)?
Example of an environmental gradient in Niger, Africa
6. Model
Software: QuantiNEMO [Neuenschwander et al. 2008, Bioinformatics]
Simulations: time-forward, individual-based, flexible model
100 populations, 2N = 200
100 unlinked neutral loci
1 linked selected locus
Selfing rates: {0.0, 0.95}
Different sampling strategies
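QuantiNEMO is a full individual-based simulator; purely to illustrate the time-forward idea, the toy below tracks one biallelic locus in several populations, with binomial drift each generation and a selection coefficient that increases along the environmental gradient. The population size, selection strengths, and generation count here are illustrative assumptions, not the study's parameters.

```python
import random

def next_generation(p, n_chrom, s):
    """One Wright-Fisher generation: haploid selection shifts the allele
    frequency, then binomial sampling of 2N chromosomes adds drift."""
    p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))
    return sum(random.random() < p_sel for _ in range(n_chrom)) / n_chrom

def simulate(n_pops=10, n_chrom=200, generations=200):
    """The selection coefficient increases along the gradient of
    populations, so the favoured allele becomes differentiated."""
    random.seed(42)
    freqs = [0.5] * n_pops
    for _ in range(generations):
        freqs = [next_generation(p, n_chrom, 0.1 * i / (n_pops - 1))
                 for i, p in enumerate(freqs)]
    return freqs

freqs = simulate()
print(freqs)  # frequencies should rise toward the high-selection end
```

Running many replicates of a simulation like this, for neutral and selected loci, is what produces the detection-rate statistics evaluated on the later slides.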
13. Methods studied

Name      Reference               Data                  Category
FDIST     Beaumont et al. 2006    Population            Differentiation-based
DETSEL    Vitalis et al. 2001     Pairs of populations  Differentiation-based
FLK       Bonhomme et al. 2010    Population            Differentiation-based
BAYSCAN   Foll et al. 2008        Population            Differentiation-based
BAYENV    Coop et al. 2010        Population            Correlation-based (uses environmental data)
SAM       Joost et al. 2006       Individual            Correlation-based (uses environmental data)
GEE       Poncet et al. 2010      Individual            Correlation-based (uses environmental data)
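The correlation-based family can be illustrated with a toy association test: correlate per-population allele frequencies with an environmental variable and flag loci showing a strong monotone trend. The real methods (BAYENV, SAM, GEE) also control for population structure and demography; this sketch, using a plain Spearman rank correlation, deliberately does not, and the environmental values below are made-up numbers.

```python
def spearman(xs, ys):
    """Spearman rank correlation: Pearson correlation of the ranks
    (no tie handling, which is fine for this toy example)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

env = [10, 14, 18, 22, 26, 30]            # e.g. temperature along a gradient
clinal = [0.1, 0.2, 0.35, 0.5, 0.7, 0.9]  # frequency tracks the gradient
noisy = [0.4, 0.6, 0.3, 0.55, 0.45, 0.5]  # no systematic trend
print(spearman(env, clinal))  # → 1.0 (perfectly monotone)
print(spearman(env, noisy))
```

A rank-based statistic like this is also the flavour of "non-parametric correlation" mentioned in the conclusion: it detects any monotone allele-frequency cline, not only linear ones.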
14. Evaluation of the methods
• Simulate neutral and selected loci
• Use each method to compute the proportion of loci detected

Simulated  % of loci detected
Neutral    Percentage of false positives (expected: 5%)
Selected   Percentage of true positives (expected: close to 100%)
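The evaluation criterion on this slide translates directly into code: given the simulated truth labels and the set of loci a method flags, compute the false-positive rate on neutral loci (expected around 5% at a 5% threshold) and the true-positive rate, i.e. power, on selected loci. The labels and flags below are made-up illustrative values.

```python
def evaluate(is_selected, is_flagged):
    """False-positive rate on neutral loci and true-positive rate
    (power) on selected loci, from parallel boolean lists."""
    neutral = [f for s, f in zip(is_selected, is_flagged) if not s]
    selected = [f for s, f in zip(is_selected, is_flagged) if s]
    fpr = sum(neutral) / len(neutral)
    tpr = sum(selected) / len(selected)
    return fpr, tpr

# 100 neutral loci of which 5 are wrongly flagged; 1 selected locus, detected.
truth = [False] * 100 + [True]
flags = [True] * 5 + [False] * 95 + [True]
fpr, tpr = evaluate(truth, flags)
print(fpr, tpr)  # → 0.05 1.0
```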
20. Conclusion
- Methods based on differentiation are conservative
- Methods based on correlation are more powerful / efficient
- ...but they require good environmental data
- New development: non-parametric correlation
- Sampling more populations is the most efficient strategy

De Mita et al., 2013, Molecular Ecology
De Mita & Siol, 2012, BMC Genetics
22. Ongoing work using this approach
- Medicago truncatula: natural populations (selfing)
- Rice (Oryza sp.): traditional varieties in Guinea and Madagascar (selfing)
23. Acknowledgements
IRD, Montpellier: S De Mita, AC Thuillet, JL Pham, C Berthouly
CIRAD, Montpellier: N Ahmadi
INRA, Montpellier: L Gay
Université de Provence: S Manel
On the ongoing project: UAM Niamey (Y Bakasso, IS Ousseini), ISRA Sénégal (N Kané)
ARCAD Project (Agropolis Research Center for Crop Diversity and Adaptation)