This is the presentation I gave in the plenary session at the ICCS (Presentation A3). It has been slightly modified for publication. Please contact me if you have any questions!
This document provides an overview of three-dimensional displays, from physiological depth cues to electronic holograms. It discusses depth cues our eyes use to perceive depth, including psychological cues like occlusion and physiological cues like binocular disparity. Examples of 3D displays that provide some depth cues, like lenticular sheets, are described. The document also covers holograms, including how they can provide all depth cues by reconstructing the original wavefront. It discusses challenges like the large amount of information in holograms and methods to reduce it, like rainbow and multiplex holograms. Computer generated and electronic holograms using dynamic modulators are also summarized.
Why we don’t know how many colors there are - Jan Morovic
There is no definitive answer to how many colors exist because the concept of color depends on factors like the illumination, viewing conditions, and human perception. Computational models can predict color gamuts under different scenarios, but the largest gamut volume estimated is around 6.6 million colors using real measured light sources, which still may not capture all possible colors perceivable by humans. Determining all possible colors ultimately requires a color appearance model that more closely mimics the complexities of human vision.
This document discusses using boosted decision trees to select important hyperspectral bands for geology classification. It aims to reduce dimensionality and processing time while maintaining classification accuracy. The method embeds band selection within the boosting process to identify the most informative bands. Experiments are conducted on hyperspectral data from an iron ore mine to evaluate the approach.
Steerable Filters generated with the Hypercomplex Dual-Tree Wavelet Transform... - Jan Wedekind
The use of wavelets in the image processing domain is still in its infancy, and largely associated with image compression. With the advent of the dual-tree hypercomplex wavelet transform (DHWT) and its improved shift invariance and directional selectivity, applications in other areas of image processing are more conceivable. This paper discusses the problems and solutions in developing the DHWT and its inverse. It also offers a practical implementation of the algorithms involved. The aim of this work is to apply the DHWT in machine vision.
Tentative work on a possible new way of feature extraction is presented. The paper shows that 2-D hypercomplex basis wavelets can be used to generate steerable filters which allow rotation as well as translation.
Initial Sintering Mechanism of Mesocarbon Microbeads - guestdc9119
The document analyzes the initial sintering mechanisms of mesocarbon microbeads through various experiments. It finds that sample shrinkage is primarily due to increasing theoretical density caused by crystallographic transformation during heating. The β-resin helps maintain particle cohesion during sintering. Sample porosity remains largely unaffected by the sintering process. Thermogravimetric analysis and heating schedule experiments help understand mass change and shrinkage over time and temperature.
Using proteomics, researchers analyzed the physiological response of the Pacific oyster Crassostrea gigas to varying levels of ocean acidification (pCO2) and mechanical stress. They identified 1,241 proteins and found oxidation-reduction was the most significantly enriched process. pCO2 had a significant effect on the proteome, while mechanical stress only impacted the proteome at high pCO2. Exposure to multiple stressors can impact an organism's ability to respond to either stressor individually. Considering multiple stressors is important for assessing responses to environmental change.
This document discusses compressive sensing and its applications for transient signal analysis. It introduces compressive sensing as a technique to reduce data measurements while preserving signal information using sparsity. Transient signals are sparse and can be represented by a small number of waveforms. The document proposes using compressive sensing for transient detection by exploiting signal sparsity and reconstructing signals from undersampled data. It describes applications in power quality analysis, audio/biomedical signals, and radar. Advantages over wavelet methods include preserving transient characteristics like amplitude and frequency.
This document provides an overview of basic statistics concepts and terminology. It discusses descriptive and inferential statistics, measures of central tendency (mean, median, mode), measures of variability, distributions, correlations, outliers, frequencies, t-tests, confidence intervals, research designs, hypotheses testing, and data analysis procedures. Key steps in research like research design, data collection, and statistical analysis are outlined. Descriptive statistics are used to describe data while inferential statistics investigate hypotheses about populations. Common statistical analyses and concepts are also defined.
This document outlines the key topics to be covered in an introductory econometrics course, including definitions, applications, and illustrations of econometrics. The course will define econometrics and statistics, explore how econometrics is used to test economic theories, estimate relationships, and make policy recommendations. Examples of applying econometrics to questions around class size and grades, racial discrimination in mortgage lending, taxes and cigarette smoking, and forecasting inflation will also be discussed to illustrate distinguishing econometrics from general statistics.
This document appears to be a chapter from a textbook on statistics that includes sample problems and answers. It provides the table of contents for homework problems 1-23 from Chapter 1 on introduction to statistics. The problems cover topics like identifying independent and dependent variables, distinguishing between experimental and non-experimental research studies, different types of variables, and computing statistical expressions. The document serves as a reference for students to review examples and solutions to common introductory statistics problems.
Applied Statistics: Sampling method & central limit theorem - wahidsajol
This document discusses sampling methods and the central limit theorem. It provides details on types of probability sampling including simple random sampling, systematic sampling, stratified sampling, and cluster sampling. Simple random sampling involves randomly selecting items from a population so that each item has an equal chance of selection. Systematic sampling selects every kth item from a population. Stratified sampling divides a population into subgroups and then randomly samples from each subgroup. Cluster sampling divides a population into geographical clusters and randomly samples from each cluster. The document also explains that the central limit theorem states that the sampling distribution of sample means will approximate a normal distribution as sample size increases.
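A quick simulation makes the theorem concrete. The sketch below is illustrative only (the exponential population and sample sizes are arbitrary choices, not from the document): sample means of a clearly skewed population approach a normal distribution, with spread shrinking like sigma/sqrt(n).

```python
# Minimal CLT demo with NumPy: means of samples from a skewed population.
import numpy as np

rng = np.random.default_rng(0)
population = rng.exponential(scale=2.0, size=100_000)  # clearly non-normal

for n in (2, 10, 50):
    # 5,000 samples of size n; look at the distribution of their means
    means = rng.choice(population, size=(5_000, n)).mean(axis=1)
    print(f"n={n:2d}  mean of means={means.mean():.3f}  "
          f"std of means={means.std():.3f}  theory={population.std() / np.sqrt(n):.3f}")
```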
This document discusses the role and importance of statistics in scientific research. It begins by defining statistics as the science of learning from data and communicating uncertainty. Statistics are important for summarizing, analyzing, and drawing inferences from data in research studies. They also allow researchers to effectively present their findings and support their conclusions. The document then describes how statistics are used and are important in many fields of scientific research like biology, economics, physics, and more. It also provides examples of statistical terms commonly used in research studies and some common misuses of statistics.
Objective Determination Of Minimum Engine Mapping Requirements For Optimal SI... - pmaloney1
The document discusses determining the minimum number of engine test points needed for optimal spark-ignition direct-injection engine calibration. Results found that approximately 100 spark sweeps were required to optimally calibrate the engine using a 2-stage modeling approach. A problem-solving approach was used that involved designing test matrices, developing 2-stage models, and comparing calibrations from reduced and exhaustive test designs.
Towards Probabilistic Assessment of Modularity - Kevin Hoffman
We propose new modularity metrics based on probabilistic computations over weighted design structure matrices, also known as dependency structure matrices.
We define how these metrics are calculated, discuss some practical applications, and conclude with future work.
This document discusses machine learning projects for classifying and clustering glass data using R. It first describes classifying glass data using ridge regression and plotting the results. It shows the classification performance is good based on error rate but poor based on ROC. Using higher order polynomials improves ROC, TPR and FPR. It also notes how to properly implement ridge regression. The document then demonstrates clustering glass data using multi-dimensional scaling, k-means clustering, and hierarchical clustering and compares the clustering to the original labels.
1) The authors developed new models for quartz-enhanced photoacoustic spectroscopy (QEPAS) sensors that account for viscous damping effects in order to enable numerical optimization of sensor design.
2) Their viscous damping model describes how fluid viscosity attenuates acoustic pressure waves and dampens the resonant mechanical deformation of the quartz tuning fork in QEPAS sensors.
3) Preliminary experimental validation showed good agreement between the model and measurements of acoustic signal strength as a function of laser beam position, though discrepancies occurred at larger distances due to unmodeled QEPAS-ROTADE interaction effects.
1) The document discusses different radiotherapy techniques for head and neck tumors including conformal radiotherapy and IMRT.
2) It describes target volumes according to ICRU guidelines and the delineation of lymph node levels in the neck.
3) Examples are provided of different conformal radiotherapy plans including an "antique" 7-field plan, a "middle age" 9-field plan, and an "art nouveau" 10-field plan with comparisons of dose distributions and organ at risk sparing.
4) Inverse planning IMRT is also discussed with an example cumulative segment dose distribution.
This document reviews different techniques for predicting solar irradiance levels including:
1) Numerical weather prediction models and statistical prediction for short to long term forecasting.
2) MOS (Model Output Statistics) techniques using sky cover products from weather centers for short term prediction.
3) Satellite-based methods comparing different approaches and error analysis for short term forecasting.
4) Signal analysis techniques including wavelet transforms and artificial neural networks combined with recurrent networks for improved prediction.
Future work proposed includes combining these methods along with normalized data and forecasts from weather centers to improve prediction accuracy across timescales.
1. The document contains graphs and tables about Wikipedia data such as the number of editors over time, article quality metrics, and calculation times for important editor identification methods.
2. It analyzes the impact of reducing the number of editors on metrics like description amount, number of articles, and calculation time.
3. Methods that consider both description amount and number of articles showed higher correlation and lower increased rank than methods using a single metric.
Natalia Restrepo-Coupe - Remotely-sensed photosynthetic phenology and ecosystem... - TERN Australia
This document discusses using remotely sensed data and tower eddy covariance CO2 flux measurements to study phenology and ecosystem productivity. It notes that flux tower data can help validate remote sensing phenology products by determining if they correctly capture dates of green-up, peak growing season, end of season, and season length. The document also aims to better understand what vegetation indices mean quantitatively and their biome-specific relationships to in-situ ecosystem behavior and capacity. Improving this understanding could lead to more robust land surface models informed by remote sensing.
Paper and pencil cosmological calculator - Sérgio Sacani
The document describes a paper-and-pencil cosmological calculator designed for the ΛCDM cosmological model. The calculator contains nomograms (graphs) for quantities like redshift, distance, size, age, and more for different redshift intervals up to z=20. It is based on cosmological parameters from the Planck mission of H0=67.15 km/s/Mpc, ΩΛ=0.683, and Ωm=0.317. To use the calculator, the user finds a known value and reads off other quantities at the same horizontal level.
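As a rough cross-check of what such a nomogram encodes, the comoving distance in flat ΛCDM is D_C(z) = c ∫₀^z dz′/H(z′) with H(z) = H₀√(Ωm(1+z)³ + ΩΛ). The sketch below (written for this summary, not the paper's code) evaluates it numerically with the quoted Planck parameters.

```python
# Comoving distance for flat LambdaCDM via the trapezoid rule.
import numpy as np

H0 = 67.15                      # km/s/Mpc (Planck value quoted above)
Omega_m, Omega_L = 0.317, 0.683
c = 299_792.458                 # speed of light, km/s

def H(z):
    return H0 * np.sqrt(Omega_m * (1 + z) ** 3 + Omega_L)

def comoving_distance(z, steps=100_000):
    zs = np.linspace(0.0, z, steps)
    f = 1.0 / H(zs)
    dz = zs[1] - zs[0]
    return c * dz * (f.sum() - 0.5 * (f[0] + f[-1]))   # Mpc

for z in (0.5, 1.0, 5.0, 20.0):
    print(f"z={z:>4}: D_C ~ {comoving_distance(z):,.0f} Mpc")
```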
This document summarizes the shape context algorithm for shape matching and object recognition. It discusses computing shape contexts, which describe the distribution of relative positions of points around a shape. Shape contexts are represented as histograms of distances and angles between points. The algorithm finds correspondences between points on two shapes by matching their shape contexts and minimizing the total cost. Additional cost terms can be added, such as color or texture differences. The algorithm is shown to be robust to noise and inexact rotations.
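To make the descriptor concrete, here is a minimal sketch of computing shape contexts for a 2-D point set (the bin counts and normalization are common choices, not necessarily those of the original paper):

```python
# Log-polar shape-context histograms: for each point, count where the other
# points fall in (log-distance, angle) bins relative to it.
import numpy as np

def shape_contexts(points, n_r=5, n_theta=12):
    """points: (N, 2) array -> (N, n_r * n_theta) histograms."""
    diff = points[None, :, :] - points[:, None, :]        # pairwise offsets
    dist = np.linalg.norm(diff, axis=2)
    angle = np.arctan2(diff[..., 1], diff[..., 0]) % (2 * np.pi)

    # Log-spaced radial bins scaled by the mean pairwise distance,
    # which makes the descriptor scale invariant.
    mean_d = dist[dist > 0].mean()
    r_edges = np.logspace(np.log10(0.125), np.log10(2.0), n_r + 1) * mean_d

    hists = np.zeros((len(points), n_r * n_theta))
    for i in range(len(points)):
        mask = dist[i] > 0                                # exclude the point itself
        r_bin = np.searchsorted(r_edges, dist[i, mask]) - 1
        t_bin = (angle[i, mask] / (2 * np.pi) * n_theta).astype(int) % n_theta
        ok = (r_bin >= 0) & (r_bin < n_r)                 # drop out-of-range radii
        np.add.at(hists[i], r_bin[ok] * n_theta + t_bin[ok], 1)
    return hists
```

Matching then compares histograms pairwise, typically with a chi-squared cost, and solves the resulting assignment problem.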
1. The document discusses Wikipedia and analyzes data related to editor contributions and page views over time.
2. It finds that most edits on Wikipedia are made by a small group of elite editors, with 20% of editors contributing 80% of edits, following a Zipfian distribution.
3. The analysis also examines how reducing the number of editors impacts the amount of content on Wikipedia pages and determines that credible editors who make high quality contributions can help offset reductions in editor numbers.
This new release is feature-rich: we added several new functions, including trend analysis (for linear, polynomial, logarithmic, and exponential trends), histograms, spectral analysis (discrete Fourier transform), and more. We also revised the existing correlation function (XCF) to support new methods (e.g. Kendall and Spearman) and added a statistical test for examining its significance. Finally, NumXL now includes a new unit-root and stationarity test: the Augmented Dickey-Fuller (ADF) test.
http://www.spiderfinancial.com/products/numxl
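For readers working outside Excel, the same unit-root test is available in Python's statsmodels; a minimal sketch (this is statsmodels' ADF shown as an analogue of the NumXL feature, not NumXL itself):

```python
# ADF unit-root test on a simulated random walk (which has a unit root).
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
random_walk = np.cumsum(rng.normal(size=500))

stat, pvalue, *_ = adfuller(random_walk)
print(f"ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")
# A large p-value: the unit-root (non-stationarity) hypothesis is not rejected.
```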
FDA’s emphasis on quality by design began with the recognition that increased testing does not improve product quality (this has long been recognized in other industries). In order for quality to increase, it must be built into the product. To do this requires understanding how formulation and manufacturing process variables influence product quality. Quality by Design (QbD) is a systematic approach to pharmaceutical development that begins with predefined objectives and emphasizes product and process understanding and process control, based on sound science and quality risk management.
This presentation - Part IV in the series - deals with the concepts of Design Space, Design of Experiments, and Models. It was compiled from material freely available from FDA, ICH, EMEA, and other free resources on the world wide web.
This document discusses analog to digital converters (ADCs). It begins by explaining that ADCs convert analog signals, which have infinite states, into digital signals that only have two states (on or off). The document then discusses the basic concepts of quantization and coding in ADCs. It provides examples of different types of ADCs including flash, dual-slope, voltage-to-frequency, and successive approximation ADCs. For successive approximation ADCs, it provides a detailed example of the conversion process. Finally, it discusses implementing an ADC using the MC68HC11A8 microcontroller and its registers.
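The successive-approximation conversion is essentially a binary search from the most significant bit down; a minimal sketch of that loop (generic logic in Python, not MC68HC11 register code):

```python
# SAR ADC logic: test each bit from MSB to LSB against the DAC output.
def sar_adc(v_in, v_ref=5.0, bits=8):
    code = 0
    for bit in range(bits - 1, -1, -1):
        trial = code | (1 << bit)                # tentatively set this bit
        if v_in >= trial * v_ref / (1 << bits):  # compare input with DAC level
            code = trial                         # keep the bit
    return code

print(sar_adc(3.2))   # 3.2 V against a 5 V reference -> code 163 of 255
```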
Faster, More Effective Flowgraph-based Malware Classification - Silvio Cesare
Silvio Cesare is a PhD candidate at Deakin University researching malware detection and automated vulnerability discovery. His current work extends his Masters research on fast automated unpacking and classification of malware. He presented this work last year at Ruxcon 2010. His system uses control flow graphs and q-grams of decompiled code as "birthmarks" to detect unknown malware samples that are suspiciously similar to known malware, reducing the need for signatures. He evaluated the system on 10,000 malware samples with only 10 false positives. The system provides improved effectiveness and efficiency over his previous work in 2010.
1. The document discusses the Discrete Cosine Transform (DCT), which is commonly used in image and video processing applications to decorrelate pixel data and reduce redundancy.
2. A typical image/video transmission system first applies a transformation like the DCT in the source encoder to decorrelate pixel values, followed by quantization and entropy encoding to further compress the data.
3. The DCT maps the spatially correlated pixel data into transformed coefficients that are decorrelated. This decorrelation reduces interpixel redundancy and allows more efficient compression of image and video data.
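A small numerical illustration of that decorrelation (assumed 8x8 data and SciPy's DCT-II, not taken from the document): a smooth, highly correlated block ends up with nearly all of its energy in a handful of low-frequency coefficients.

```python
# 2-D DCT energy compaction on a smooth 8x8 block.
import numpy as np
from scipy.fftpack import dct

def dct2(block):
    # Separable 2-D DCT-II with orthonormal scaling (rows, then columns)
    return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

x, y = np.meshgrid(np.arange(8), np.arange(8))
block = 128 + 40 * np.cos(np.pi * x / 8)      # smooth, correlated "pixels"

energy = dct2(block) ** 2
top4 = np.sort(energy.ravel())[-4:].sum()
print(f"top 4 of 64 coefficients hold {top4 / energy.sum():.1%} of the energy")
```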
Co-Chairs, Val J. Lowe, MD, and Cyrus A. Raji, MD, PhD, prepared useful Practice Aids pertaining to Alzheimer’s disease for this CME/AAPA activity titled “Alzheimer’s Disease Case Conference: Gearing Up for the Expanding Role of Neuroradiology in Diagnosis and Treatment.” For the full presentation, downloadable Practice Aids, and complete CME/AAPA information, and to apply for credit, please visit us at https://bit.ly/3PvVY25. CME/AAPA credit will be available until June 28, 2025.
Breast cancer: Post menopausal endocrine therapy - Dr. Sumit KUMAR
Breast cancer in postmenopausal women with hormone receptor-positive (HR+) status is a common and complex condition that necessitates a multifaceted approach to management. HR+ breast cancer means that the cancer cells grow in response to hormones such as estrogen and progesterone. This subtype is prevalent among postmenopausal women and typically exhibits a more indolent course compared to other forms of breast cancer, which allows for a variety of treatment options.
Diagnosis and Staging
The diagnosis of HR+ breast cancer begins with clinical evaluation, imaging, and biopsy. Imaging modalities such as mammography, ultrasound, and MRI help in assessing the extent of the disease. Histopathological examination and immunohistochemical staining of the biopsy sample confirm the diagnosis and hormone receptor status by identifying the presence of estrogen receptors (ER) and progesterone receptors (PR) on the tumor cells.
Staging involves determining the size of the tumor (T), the involvement of regional lymph nodes (N), and the presence of distant metastasis (M). The American Joint Committee on Cancer (AJCC) staging system is commonly used. Accurate staging is critical as it guides treatment decisions.
Treatment Options
Endocrine Therapy
Endocrine therapy is the cornerstone of treatment for HR+ breast cancer in postmenopausal women. The primary goal is to reduce the levels of estrogen or block its effects on cancer cells. Commonly used agents include:
Selective Estrogen Receptor Modulators (SERMs): Tamoxifen is a SERM that binds to estrogen receptors, blocking estrogen from stimulating breast cancer cells. It is effective but may have side effects such as increased risk of endometrial cancer and thromboembolic events.
Aromatase Inhibitors (AIs): These drugs, including anastrozole, letrozole, and exemestane, lower estrogen levels by inhibiting the aromatase enzyme, which converts androgens to estrogen in peripheral tissues. AIs are generally preferred in postmenopausal women due to their efficacy and safety profile compared to tamoxifen.
Selective Estrogen Receptor Downregulators (SERDs): Fulvestrant is a SERD that degrades estrogen receptors and is used in cases where resistance to other endocrine therapies develops.
Combination Therapies
Combining endocrine therapy with other treatments enhances efficacy. Examples include:
Endocrine Therapy with CDK4/6 Inhibitors: Palbociclib, ribociclib, and abemaciclib are CDK4/6 inhibitors that, when combined with endocrine therapy, significantly improve progression-free survival in advanced HR+ breast cancer.
Endocrine Therapy with mTOR Inhibitors: Everolimus, an mTOR inhibitor, can be added to endocrine therapy for patients who have developed resistance to aromatase inhibitors.
Chemotherapy
Chemotherapy is generally reserved for patients with high-risk features, such as large tumor size, high-grade histology, or extensive lymph node involvement. Regimens often include anthracyclines and taxanes.
Nano-gold for Cancer Therapy chemistry investigatory project - SIVAVINAYAKPK
chemistry investigatory project
The development of nanogold-based cancer therapy could revolutionize oncology by providing a more targeted, less invasive treatment option. This project contributes to the growing body of research aimed at harnessing nanotechnology for medical applications, paving the way for future clinical trials and potential commercial applications.
Cancer remains one of the leading causes of death worldwide, prompting the need for innovative treatment methods. Nanotechnology offers promising new approaches, including the use of gold nanoparticles (nanogold) for targeted cancer therapy. Nanogold particles possess unique physical and chemical properties that make them suitable for drug delivery, imaging, and photothermal therapy.
These lecture slides, by Dr Sidra Arshad, offer a simplified look into the mechanisms involved in the regulation of respiration:
Learning objectives:
1. Describe the organisation of the respiratory center
2. Describe the nervous control of inspiration and respiratory rhythm
3. Describe the functions of the dorsal and ventral respiratory groups of neurons
4. Describe the influences of the Pneumotaxic and Apneustic centers
5. Explain the role of the Hering-Breuer inflation reflex in regulation of inspiration
6. Explain the role of central chemoreceptors in regulation of respiration
7. Explain the role of peripheral chemoreceptors in regulation of respiration
8. Explain the regulation of respiration during exercise
9. Integrate the respiratory regulatory mechanisms
10. Describe the Cheyne-Stokes breathing
Study Resources:
1. Chapter 42, Guyton and Hall Textbook of Medical Physiology, 14th edition
2. Chapter 36, Ganong’s Review of Medical Physiology, 26th edition
3. Chapter 13, Human Physiology by Lauralee Sherwood, 9th edition
Low birth weight presentation. A low birth weight (LBW) infant is defined as one whose birth weight is less than 2500 g, irrespective of gestational age. Premature birth and low birth weight remain a serious problem in newborns, causing high morbidity and mortality worldwide. The nursing care provided to low birth weight babies is crucial in promoting their overall health and development; careful assessment, diagnosis, planning, and evaluation play a vital role in ensuring these vulnerable infants receive the specialized care they need. In India, every third infant weighs less than 2500 g.
Birth period, socioeconomic status, nutrition, and the intrauterine environment are factors influencing low birth weight.
The skin is the largest organ, and its health plays a vital role alongside that of the other sense organs. For skin concerns like acne breakouts, psoriasis, or anything similar along those lines, finding a qualified and experienced dermatologist becomes paramount.
Promoting Wellbeing - Applied Social Psychology - Psychology SuperNotes - PsychoTech Services
A proprietary approach developed by bringing together the best of learning theories from Psychology, design principles from the world of visualization, and pedagogical methods from over a decade of training experience, that enables you to: Learn better, faster!
8 Surprising Reasons To Meditate 40 Minutes A Day That Can Change Your Life - Holistified Wellness
We’re talking about Vedic Meditation, a form of meditation that has been around for at least 5,000 years. Back then, the people who lived in the Indus Valley, now known as India and Pakistan, practised meditation as a fundamental part of daily life. This knowledge, which has given us yoga and Ayurveda, was known as Veda, hence the name Vedic. And though there are some written records, the practice has been passed down verbally from generation to generation.
Travel vaccination in Manchester offers comprehensive immunization services for individuals planning international trips. Expert healthcare providers administer vaccines tailored to your destination, ensuring you stay protected against various diseases. Conveniently located clinics and flexible appointment options make it easy to get the necessary shots before your journey. Stay healthy and travel with confidence by getting vaccinated in Manchester. Visit us: www.nxhealthcare.co.uk
Summer is a time for fun in the sun, but the heat and humidity can also wreak havoc on your skin. From itchy rashes to unwanted pigmentation, several skin conditions become more prevalent during these warmer months.
1. Real World Applications of Proteochemometric Modeling
The Design of Enzyme Inhibitors and Ligands of G-Protein Coupled Receptors
2. Contents
• Our current approach to Proteochemometric Modeling
• Part I: PCM applied to non-nucleoside reverse transcriptase inhibitors and HIV mutants
• Part II: PCM applied to small molecules and the Adenosine receptors
• Conclusions
3. What is PCM?
• Proteochemometric modeling needs both a ligand descriptor and a target descriptor
• Descriptors need to be compatible with each other and with the machine learning technique
G.J.P. van Westen, J.K. Wegner et al., MedChemComm (2011), 16-30, 10.1039/C0MD00165A
6. Ligand Descriptors
• Scitegic Circular Fingerprints
▫ Circular, substructure-based fingerprints
▫ Maximal diameter of 3 bonds from the central atom
▫ Each substructure is converted to a molecular feature
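Scitegic's circular fingerprints are proprietary, but RDKit's Morgan fingerprints implement the same circular-substructure idea and make a convenient open stand-in. A minimal sketch (the SMILES is an arbitrary example, and Morgan radii grow in whole bonds, so radius 1 only approximates the 3-bond diameter above):

```python
# Circular (Morgan/ECFP-style) ligand descriptor with RDKit.
from rdkit import Chem
from rdkit.Chem import AllChem

mol = Chem.MolFromSmiles("Cc1ccccc1NC(=O)C")            # arbitrary example ligand
fp = AllChem.GetMorganFingerprintAsBitVect(mol, 1, nBits=2048)
ligand_descriptor = list(fp)   # 2048 bits, one per hashed circular substructure
```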
7. Target Descriptors
• Select binding site residues from the full protein sequence
• Each unique hashed feature represents one amino acid type (comparable with circular fingerprints)
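A minimal sketch of the same idea, using a plain one-hot block per selected residue (the positions and encoding are illustrative assumptions, not the authors' exact hashed descriptor):

```python
# Target descriptor: one-hot encode the amino acid at each binding-site position.
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AA)}

def target_descriptor(sequence, binding_site_positions):
    desc = np.zeros((len(binding_site_positions), len(AA)))
    for row, pos in enumerate(binding_site_positions):
        desc[row, AA_INDEX[sequence[pos]]] = 1.0
    return desc.ravel()   # e.g. 24 residues -> 24 * 20 = 480 features
```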
8. Machine Learning
• Using R-statistics as integrated with Pipeline Pilot
▫ Version 2.11.1 (64-bit)
• Sampled several machine learning techniques
▫ SVM (final method of choice)
▫ PLS
▫ Random Forest
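The core of the PCM setup is that each training row is a compound-target pair described by the concatenation of the ligand and target blocks above. A minimal sketch with scikit-learn's SVR standing in for the Pipeline Pilot/R workflow (all arrays below are random placeholders):

```python
# PCM regression: concatenate ligand and target descriptors, fit an SVM.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
ligand_fps = rng.random((200, 64))     # placeholder ligand fingerprints
target_descs = rng.random((200, 48))   # placeholder target descriptors
pec50 = rng.random(200) * 4 + 5        # placeholder pEC50 labels

X = np.hstack([ligand_fps, target_descs])   # one row per compound-target pair
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, pec50)
```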
9. Real World Applications of PCM
• Part I: PCM of NNRTIs (analog series) on 14 mutants
▫ Output variable: pEC50
▫ Data set provided by Tibotec
▫ Prospectively validated
• Part II: PCM of small molecules on the Adenosine receptors
▫ Output variable: pKi
▫ ChEMBL_04 / StarLite
▫ Both human and rat data combined
▫ Prospectively validated
10. Part I: PCM applied to NNRTIs
Which inhibitor(s) show(s) the best activity spectrum and can proceed in drug development?
• 451 HIV Reverse Transcriptase (RT) inhibitors
• 14 HIV RT sequences
▫ Between zero and 13 point mutations (at the NNRTI binding site)
▫ Large differences in compound activity on different sequences

Sequence   Mean pEC50   StdDev pEC50   n
1 (wt)     8.3          0.6            451
2          6.9          0.7            259
3          7.6          0.6            444
4          7.5          0.7            443
5          7.4          0.8            429
6          6.0          0.6            316
7          6.5          0.6            99
8          6.9          0.7            147
9          8.3          0.6            222
10         7.9          0.7            252
11         7.5          0.7            257
12         8.0          0.6            242
13         7.4          0.8            244
14         8.2          0.8            220
11. Binding Site
• Selected binding site based on point mutations present in the different strains
• 24 residues were selected
12. Used model to predict missing values
[Figure: compounds x mutants activity matrix, shown as the original data set with gaps and as the matrix completed with the model]
13. Prospective Validation
• Compounds have been experimentally validated
▫ Predictions where the pEC50 differs two sd from the compound average (69 compound outliers)
▫ Predictions where the pEC50 differs two sd from the sequence average (61 sequence outliers)
• Assay validation
15. The Applicability Domain Concept Still Holds in Target Space
• Prediction error shows a direct correlation with average sequence similarity to the training set
[Figure: R0^2 and RMSE plotted against average sequence similarity with the training set (0.5 to 1.0)]
18. Does PCM outperform scaling and QSAR?
• PCM outperforms QSAR models trained with identical descriptors on the same set
• When considering outliers, PCM outperforms scaling
• PCM can be applied to previously unseen mutants

Validation Experiment   Assay   PCM    pEC50 scaling   QSAR   10-NN (both)   10-NN (target)   10-NN (cmpd)
R0^2 (Full plot)        0.88    0.69   0.69            0.31   0.41           0.21             0.28
R0^2 (Outliers)         0.88    0.61   0.59            0.36   0.34           0.32             0.18
RMSE (Full plot)        0.50    0.62   0.57            0.96   0.90           1.29             1.16
RMSE (Outliers)         0.50    0.52   0.58            1.06   0.72           1.39             1.29
22. Conclusions
• PCM can guide inhibitor design by predicting bioactivity profiles, as applied here to NNRTIs
• We have shown prospectively that the performance of PCM approaches assay reproducibility (RMSE 0.62 vs 0.50)
• Interpretation allows selection between preferred chemical substructures and substructures to be avoided
23. Part II: PCM applied to the Adenosine Receptors
• Model based on public data (ChEMBL_04)
• Included:
▫ Human receptor data
▫ (Historic) rat receptor data
• Defined a single binding site (including ELs)
▫ Based on crystal structure 3EML; selected residues translated through an MSA to the other receptors
• Looking for novel A2A receptor ligands, taking SAR information from other adenosine receptor subtypes into account
25. Adenosine Receptor Data Set
• Little overlap between species
• Validation set consists of 4556 decoys and 43 known actives

Receptor   Human   Rat    Overlap   Range (pKi)   External Validation   Decoy
A1         1635    2216   147       4.5 - 9.7     130                   1139
A2A        1526    2051   215       4.5 - 10.5    57                    1139
A2B        780     803    79        4.5 - 9.7     11                    1139
A3         1661    327    82        4.5 - 10.0    255                   1139
26. In-silico validation
• External validation on an in-house compound collection
▫ A lower quality data set leads to a less predictive model
▫ Inclusion of rat data improves the model (RMSE 0.82 vs 0.87)
• Our final model is able to separate actives from decoys
▫ 33 of the 43 known actives were in the top 50
27. Prospective Validation
• Scanned the ChemDiv supplier database (> 790,000 cmpds)
• Selected 55 compounds with a focus on diverse chemistry
▫ Compounds were tested in-vitro
28. Conclusions
• We have found novel compounds active (in the nanomolar range) on the A2A receptor
▫ Hit rate ~11%
• PCM models benefit from the addition of similar targets from other species (RMSE improves from 0.87 to 0.82)
• PCM models can make robust predictions, even when trained on data from different labs
29. Further discussion
• Poster #47: A. Hendriks, G.J.P. van Westen et al. - Proteochemometric Modeling as a Tool to Predict Clinical Response to Antiretroviral Therapy Based on the Dominant Patient HIV Genotype
• Poster #51: E.B. Lenselink, G.J.P. van Westen et al. - A Global Class A GPCR Proteochemometric Model: A Prospective Validation
• Poster #54: R.F. Swier, G.J.P. van Westen et al. - 3D-neighbourhood Protein Descriptors for Proteochemometric Modeling
30. Acknowledgements
• Prof. Ad IJzerman
• Andreas Bender
• Olaf van den Hoven
• Rianne van der Pijl
• Thea Mulder
• Henk de Vries
• Alwin Hendriks
• Bart Lenselink
• Remco Swier
• Prof. Herman van Vlijmen
• Joerg Wegner
• Anik Peeters
• Peggy Geluykens
• Leen Kwanten
• Inge Vereycken
31. Real World Applications of Proteochemometric Modeling
The Design of Enzyme Inhibitors and Ligands of G-Protein Coupled Receptors
33. Leave One Sequence Out
• By leaving out one sequence in training and validating the trained model on that sequence, model performance on novel mutants is emulated