This document describes a study that developed an algorithm to model photon beam data by solving the Linear Boltzmann Transport Equation (LBTE). The algorithm was implemented in a treatment planning system to generate beam profiles and percentage depth doses (PDD), which were then compared to experimentally measured data. The calculated PDDs matched the measured data closely for small fields, but showed some shift in the tail for large fields. The calculated wedge PDDs and profiles also showed some small shifts compared to measurements. Overall, there was good agreement between the calculated and measured beam data, suggesting this algorithm could be used for beam modeling or as an independent quality assurance tool.
Hybrid Algorithm for Dose Calculation in CMS XiO Treatment Planning System (IOSR Journals)
This study aimed to design an improved hybrid algorithm by explicitly solving the linearized Boltzmann transport equation (LBTE), the governing equation that describes the macroscopic behaviour of radiation particles (neutrons, photons, electrons, etc.). The algorithm's accuracy was evaluated using a newly designed in-house verification phantom, and its results were compared with those of the other XiO photon algorithms. The LBTE was solved numerically to compute photon transport in a medium. A programming code (algorithm) for the LBTE solution was developed and applied in the treatment planning system (TPS). The accuracy of the algorithm was evaluated by creating several plans for both the designed phantom and a solid water phantom using the designed algorithm and the other XiO photon algorithms. The plans were delivered on a pre-calibrated Elekta linear accelerator to measure absorbed dose. The results for all treatment plans using the hybrid algorithm, compared to the three XiO photon algorithms, were within a 4% limit. The calculation time of the hybrid algorithm was lower than that of the other algorithms for plans with a larger number of beams, but higher for single-beam plans. The hybrid algorithm provides accuracy comparable to the other algorithms under treatment planning conditions. It can therefore be employed for dose calculation in advanced techniques such as IMRT and RapidArc at radiotherapy centres with the CMS XiO treatment planning system, as it is easy to implement.
A Diagnostic Analytics of Harmonic Source Signature Recognition by Using Peri... (IJECEIAES)
This paper presents a diagnostic analytics of harmonic source signature recognition for rectifier- and inverter-based loads in the distribution system, using a single-point measurement at the point of common coupling and the periodogram. A signature recognition pattern is used to distinguish the harmonic sources accurately by obtaining the distribution of harmonic and interharmonic components and the changes in harmonic contribution. This is achieved using the significant signature of the harmonic-producing load, obtained by analysing the harmonic contribution changes. Based on voltage and current signature analysis, the distribution of harmonic components can be divided into three zones. To distinguish between the harmonic-producing loads, the harmonic components are observed in these zones to obtain the signature recognition pattern. The results demonstrate that the periodogram technique accurately diagnoses and distinguishes the types of harmonic sources in the distribution system.
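The abstract gives no implementation details; as a rough illustration of the underlying idea only, the plain-Python sketch below computes a periodogram via a direct DFT and picks out the harmonic peaks that such a signature analysis would inspect. The test signal (a 50 Hz fundamental plus a 5th harmonic) and all names are illustrative, not taken from the paper.

```python
import math

def periodogram(signal, fs):
    """Power spectrum estimate of a sampled signal via a direct DFT.
    Returns (frequencies, power) up to the Nyquist frequency."""
    n = len(signal)
    freqs, power = [], []
    for k in range(n // 2 + 1):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        freqs.append(k * fs / n)
        power.append((re * re + im * im) / (fs * n))
    return freqs, power

# Illustrative test signal: a 50 Hz fundamental plus a 5th harmonic at
# 250 Hz, the kind of component a rectifier load injects.
fs, n = 1000, 500
x = [math.sin(2 * math.pi * 50 * t / fs) + 0.2 * math.sin(2 * math.pi * 250 * t / fs)
     for t in range(n)]
freqs, power = periodogram(x, fs)
peaks = [f for f, p in zip(freqs, power) if p > 1e-3]   # -> [50.0, 250.0]
```

Both test frequencies fall exactly on DFT bins here, so the peaks stand out with no spectral leakage; real measurements would need windowing as well.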
In this article, 180 gastric images acquired with a light microscope are used. Maximally Stable Extremal Regions (MSER) features of the images were calculated for classification, and the Discrete Fourier Transform (DFT) was applied to these MSER features. The high-dimensional MSER-DFT feature vectors were reduced to lower dimensions with Local Tangent Space Alignment (LTSA) and Neighborhood Preserving Embedding (NPE). After dimensionality reduction, feature sets of 5, 10, 15, 20, 25, 30, 35, 40, 45, and 50 dimensions were obtained. These low-dimensional data were classified with a Random Forest (RF) classifier. Thus, an MSER_DFT_LTSA-NPE_RF method for gastric histopathological images was developed. The classification results obtained with these methods were compared, and the proposed approach yielded higher classification accuracy for gastric histopathological images than the other methods.
A Hybrid Critical Path Methodology – ABCP (As Built Critical P... (ijcsitcejournal)
Edge detection in an image has become an essential process, since the edge of an image contains important information related to that image, such as pixel intensity values, minimal-path deciding factors, etc. This requires a specific methodology to guide the detection of edges and to assign a Critical Path with a minimal path set and its respective energy partitions. The basis for this approach is the Optimized Ant Colony Algorithm [2], which guides the search through the various optimized structures in the edge detection of an image. Here we consider the scenario of a medical image, as the information contained in the obtained medical image is of high value and must not suffer loss of information, whatever the modality through which it was obtained. The necessary aspects addressed in this paper are: a proper plan with a minimal set as the Critical Path; analysis with respect to the power partitions or energy partitions of the minimal set; computation of the total time taken by the algorithm to detect an edge and retrieve the data for the edge of a medical image, cumulatively considering the cliques; trade-offs in the intensity; and the number of iterations required to detect an edge in an image, with or without the presence of noise factors. This paper presents an efficient hybrid approach to edge detection within an image, considering various other factors, including the shortest path out of all the paths produced during the traversal of the ants within a medical image and an empirical evaluation of the time taken by the ants to traverse the entire image. We also construct a hybrid mechanism called the ABCP (As Built Critical Path) factor to show the deviation produced by the algorithm in covering the entire medical image, for metrics such as the shortest paths, the computation time stamps eventually obtained, and the planned schedules.
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
Channel selection is an improvement technique to optimize EEG-based BCI performance.
In previous studies, many channel selection methods—mostly based on spatial information of signals—have
been introduced. One of these channel selection techniques is the energy calculation method. In this paper,
we introduce an energy optimization calculation method, called the energy extraction method. Energy
extraction is an extension of the energy calculation method, and is divided into two steps. The first step is
energy calculation and the second is energy selection. In the energy calculation step, l2-norm is used to
calculate channel energy, while in the energy selection method we propose three techniques: “high value”
(HV), “close to mean” (CM), and “automatic”. All proposed framework schemes for energy extraction are
applied to two classes of datasets: motor movement (hand and foot movement)
and motor imagery (imagination of left- and right-hand movement). The system used a Common
Spatial Pattern (CSP) method to extract EEG signal features and k-NN as a classification method to classify
the signal features with k=3. Based on the test results, all schemes for the proposed energy extraction
method yielded improved BCI performance of up to 58%. In summary, the energy extraction approach using
the CM energy selection method was found to be the best channel selection technique.
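As a minimal sketch of the energy calculation and "close to mean" (CM) selection steps described above (the toy data and function names are illustrative, not from the paper):

```python
import math

def channel_energy(channels):
    """l2-norm energy of each EEG channel (each channel is a list of samples)."""
    return [math.sqrt(sum(s * s for s in ch)) for ch in channels]

def select_close_to_mean(channels, k):
    """'Close to mean' (CM) selection: keep the k channels whose energy
    is nearest the mean channel energy."""
    energies = channel_energy(channels)
    mean_e = sum(energies) / len(energies)
    ranked = sorted(range(len(channels)), key=lambda i: abs(energies[i] - mean_e))
    return sorted(ranked[:k])

# Four toy channels; channel 0 is an outlier with much higher energy.
data = [
    [5.0, 5.0, 5.0],
    [1.0, 1.0, 1.0],
    [1.2, 0.9, 1.1],
    [0.8, 1.1, 1.0],
]
selected = select_close_to_mean(data, 2)   # keeps channels 1 and 2
```

In the paper's pipeline the retained channels would then feed CSP feature extraction and the k-NN classifier; that part is omitted here.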
An Identification of Multiple Harmonic Sources in a Distribution System by Us... (journalBEEI)
The identification of multiple harmonic sources (MHS) is vital for identifying the root causes of a harmonic disturbance and the appropriate mitigation technique. This paper introduces a technique for identifying MHS in a power distribution system using a time-frequency distribution (TFD) analysis known as the spectrogram. The spectrogram has advantages in terms of accuracy, a less complex algorithm, and lower memory use compared to previous methods such as probabilistic and harmonic power flow direction approaches. The identification of MHS is based on the significant relationship between the spectral impedances, namely the fundamental impedance (Z1) and the harmonic impedance (Zh), estimated from the time-frequency representation (TFR). To verify the performance of the proposed method, an IEEE test feeder with several different harmonic-producing loads is simulated. The suggested method performs excellently, with 100% correct identification of MHS, and is accurate, fast, and cost-efficient for identifying MHS in power distribution systems.
CADD UNIT V - Molecular Modeling: Introduction to molecular mechanics and quantum mechanics. Energy Minimization methods and Conformational Analysis, global conformational minima determination.
SVM-PSO based Feature Selection for Improving Medical Diagnosis Reliability u... (cscpconf)
Improving the accuracy of supervised classification algorithms in biomedical applications, especially CADx, is an active area of research. This paper proposes the construction of a rotation forest (RF) ensemble using 20 learners over two clinical datasets, lymphography and backache. We propose a new feature selection strategy, based on support vector machines optimized by particle swarm optimization, that finds a relevant and minimal feature subset to obtain higher ensemble accuracy. We quantitatively analyzed 20 base learners over the two datasets, carried out experiments with a 10-fold cross-validation leave-one-out strategy, and evaluated the performance of the 20 classifiers using the metrics accuracy (acc), kappa value (K), root mean square error (RMSE), and area under the receiver operating characteristic curve (ROC). The base classifiers achieved average accuracies of 79.96% and 81.71% for the lymphography and backache datasets, respectively, while the RF ensembles produced average accuracies of 83.72% and 85.77%. The paper presents promising results using RF ensembles and provides a new direction towards the construction of reliable and robust medical diagnosis systems.
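The paper's SVM-PSO wrapper is not reproduced here; the sketch below shows the general shape of binary PSO feature selection, with a toy relevance-based fitness standing in for SVM classification accuracy. All constants, names, and the fitness function are illustrative assumptions.

```python
import math
import random

random.seed(42)

def fitness(mask, relevance=(0.9, 0.1, 0.8, 0.05)):
    """Toy wrapper fitness: reward informative features, penalize subset
    size. In the paper this role is played by SVM accuracy; the relevance
    vector here is an illustrative stand-in."""
    if not any(mask):
        return 0.0
    gain = sum(r for m, r in zip(mask, relevance) if m)
    return gain - 0.15 * sum(mask)   # pressure toward a minimal subset

def pso_feature_selection(n_features=4, n_particles=8, iters=30):
    swarm = [[random.randint(0, 1) for _ in range(n_features)]
             for _ in range(n_particles)]
    vel = [[0.0] * n_features for _ in range(n_particles)]
    pbest = [p[:] for p in swarm]
    gbest = max(swarm, key=fitness)[:]
    for _ in range(iters):
        for i, p in enumerate(swarm):
            for d in range(n_features):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - p[d])
                             + 1.5 * r2 * (gbest[d] - p[d]))
                # Sigmoid transfer turns velocity into a bit probability.
                p[d] = 1 if random.random() < 1 / (1 + math.exp(-vel[i][d])) else 0
            if fitness(p) > fitness(pbest[i]):
                pbest[i] = p[:]
            if fitness(p) > fitness(gbest):
                gbest = p[:]
    return gbest

best = pso_feature_selection()   # with the toy fitness, [1, 0, 1, 0] is optimal
```

The sigmoid transfer function is the standard way to adapt continuous PSO velocities to binary feature masks; a real wrapper would evaluate each mask by cross-validated classifier accuracy instead of the toy gain.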
Neighborhood search methods with moth optimization algorithm as a wrapper met... (IJECEIAES)
Feature selection methods select a subset of features from the data so that only useful information is mined from the samples, improving both accuracy and the computational efficiency of the learning model. The Moth-flame Optimization (MFO) algorithm is a population-based approach that simulates the behavior of real moths in nature. One drawback of the MFO algorithm, as we investigate in this paper, is that the solutions move toward the best solution and can easily become stuck in local optima. We therefore propose an MFO algorithm combined with a neighborhood search method for feature selection problems, in order to keep the MFO algorithm from getting trapped in local optima and to avoid premature convergence. The neighborhood search method is applied after a predefined number of unimproved iterations (the number of tries that fail to improve the current solution). The proposed algorithm shows good performance when compared with the original MFO algorithm and with state-of-the-art approaches.
Improved optimization of numerical association rule mining using hybrid parti... (IJECEIAES)
Particle Swarm Optimization (PSO) has been applied to solve optimization problems in various fields, such as Association Rule Mining (ARM) of numerical problems. However, PSO often becomes trapped in local optima. Consequently, the results do not represent the overall optimum solutions. To address this limitation, this study aims to combine PSO with the Cauchy distribution (PARCD), which is expected to increase the global optimal value of the expanded search space. Furthermore, this study uses multiple objective functions, i.e., support, confidence, comprehensibility, interestingness and amplitude. In addition, the proposed method was evaluated using benchmark datasets, such as the Quake, Basket ball, Body fat, Pollution, and Bolt datasets. Evaluation results were compared to the results obtained by previous studies. The results indicate that the overall values of the objective functions obtained using the proposed PARCD approach are satisfactory.
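As a hedged illustration of the core idea behind combining PSO with the Cauchy distribution, the sketch below samples Cauchy noise by inverse-transform sampling; its heavy tails produce occasional long jumps that widen the search and can pull a particle out of a local optimum. The function names and scale are illustrative assumptions, not taken from the paper.

```python
import math
import random

random.seed(0)

def cauchy_sample(scale=1.0):
    """Standard Cauchy draw via inverse-transform sampling."""
    return scale * math.tan(math.pi * (random.random() - 0.5))

def cauchy_perturb(position, scale=0.1):
    """Add heavy-tailed Cauchy noise to a particle position; the rare
    long jumps are what let a trapped particle escape a local optimum."""
    return [x + cauchy_sample(scale) for x in position]

# Heavy tails in action: roughly a fifth of standard Cauchy draws exceed
# magnitude 3, whereas Gaussian draws almost never would.
jumps = [abs(cauchy_sample()) for _ in range(10000)]
long_jumps = sum(1 for j in jumps if j > 3.0)
```

In a PARCD-style hybrid, such a perturbation would typically be applied to particle positions or velocities inside the PSO update loop; the surrounding PSO machinery is omitted here.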
Determination of Proton Energy and Dosage to Obtain SOBP Curve in the Proton ... (IJERA)
In this research, calculations and simulations to obtain the SOBP curve for a model of a thyroid tumor were performed by determining the energy and the number of protons in each proton beam. The simulations were carried out with the help of the SRIM and TRIM codes, while the computational calculations and graphics were done with Python. The SOBP curve profile obtained in this study is flatter in the tumor region than previous results by other researchers, from which it can be concluded that the tumor tissue receives a uniform lethal dose while the surrounding normal tissues receive non-lethal doses. Quantitatively, if the dose received by the tumor tissue is expressed as a 100% dose, the dose received by healthy tissue outside the tumor is at most 92%. In further development, this research method can be applied to other tissue models based on CT scan images containing tumor tissue.
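The study's SRIM/TRIM-based simulation is not reproduced here; the following Python sketch only illustrates the general principle of building an SOBP by superposing weighted pristine Bragg peaks, using a crude analytic peak model and a simple distal-to-proximal flattening heuristic. All shapes, depths, and numbers are illustrative assumptions.

```python
import math

def bragg_peak(depth, peak_depth, width=0.35):
    """Crude pristine Bragg peak: a low entrance plateau plus a Gaussian
    peak near the end of range (a stand-in for SRIM/TRIM depth-dose data)."""
    plateau = 0.3 if depth <= peak_depth else 0.0
    return plateau + math.exp(-((depth - peak_depth) ** 2) / (2 * width ** 2))

def sobp_weights(peak_depths, target=1.0):
    """Choose beam weights distal-to-proximal so the summed dose at each
    constituent peak depth reaches the target dose (simple flattening)."""
    weights = {}
    for d in sorted(peak_depths, reverse=True):
        current = sum(w * bragg_peak(d, pd) for pd, w in weights.items())
        weights[d] = max(0.0, (target - current) / bragg_peak(d, d))
    return weights

peaks = [8.0, 8.5, 9.0, 9.5, 10.0]   # pristine peak depths in cm
w = sobp_weights(peaks)
# Total dose on a depth grid from 6.0 to 10.0 cm in 1 mm steps.
dose = {d / 10: sum(wk * bragg_peak(d / 10, pd) for pd, wk in w.items())
        for d in range(60, 101)}
plateau_vals = [dose[d / 10] for d in range(80, 101)]
```

With this toy model the summed dose sits near the target across the spread-out region (8 to 10 cm) while the entrance region receives a clearly lower dose; a real treatment-planning calculation would instead optimize the weights against simulated depth-dose curves.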
Traffic Light Signal Parameters Optimization Using Modification of Multielement... (IJECEIAES)
A strategy to optimize traffic light signal parameters is presented for solving the traffic congestion problem using a modification of the Multielement Genetic Algorithm (MEGA). The aim of this method is to improve the insufficient vehicle throughput (F) of earlier work on traffic light signal parameter optimization using MEGA and Particle Swarm Optimization (PSO). The modification of MEGA adds a hash table that saves some of the best populations to accelerate MEGA's recombination process; the resulting method is called H-MEGA. The experimental results show that H-MEGA-based optimization outperforms the MEGA- and PSO-based methods, improving the F of the MEGA- and PSO-based optimization methods by about 10.01% (from 82.63% to 92.64%) and 6.88% (from 85.76% to 92.64%), respectively. In addition, H-MEGA significantly improves the real F of the Ooe Toroku road network of Kumamoto City, Japan, by about 21.62%.
EFFECTIVE REDIRECTING OF THE MOBILE ROBOT IN A MESSED ENVIRONMENT BASED ON TH... (ijfls)
The use of fuzzy logic in redirecting a mobile robot is based on two sets of received information: the first is the instantaneous distance of the robot from the obstacle, and the second is the instantaneous information on the robot's position. For this purpose, the fuzzy rule base consists of forty-two rules, extracted from the robot's distance from obstacles and the target position relative to the robot's instantaneous orientation. In the structure of the fuzzy systems, a minimum inference engine is considered. An Extended Kalman filter is used for localization in a noisy environment. Accordingly, the inputs of the fuzzy systems are determined from the estimate produced by the localization process, the information on the obstacle centres, and the target position. The linear acceleration and instantaneous orientation of the mobile robot are then determined by the designed fuzzy structures, which are applied to its kinematic model.
Our team used signal processing techniques to find the direction of desired incoming signals at an antenna while rejecting interference from other signals.
BFO – AIS: A FRAMEWORK FOR MEDICAL IMAGE CLASSIFICATION USING SOFT COMPUTING... (ijsc)
Medical images provide diagnostic evidence and information about anatomical pathology. Databases are growing enormously as digital medical imaging equipment such as Magnetic Resonance Imaging (MRI), Computed Tomography (CT), and Positron Emission Tomography CT (PET-CT) becomes part of clinical work. CT images distinguish various tissues according to gray levels, aiding medical diagnosis; CT is more reliable for early detection of tumours and haemorrhages, as it provides the anatomical information needed to plan radiotherapy. The goal of medical information systems is to deliver information to the right persons at the right time and place, improving the quality and efficiency of the care process. This paper proposes an Artificial Immune System (AIS) classifier and a feature selection method based on hybrid Bacterial Foraging Optimization (BFO) with Local Search (LS) for medical image classification.
Metasem: An R Package For Meta-Analysis Using Structural Equation Modelling: ... (Pubrica)
This presentation explains metaSEM, an R package for meta-analysis using structural equation modelling:
1. An SEM-based meta-analytic model is formulated for conducting the meta-analysis, which is used to analyse structural relationships. The meta-analysis can be univariate, multivariate, or three-level.
2. The structural equation model (SEM) is generally optimized and fitted using the OpenMx package.
3. Routine analyses can be run from R in batch mode, either interactively or non-interactively. A graphical interface such as RStudio is a convenient way for users to interact with the analysis.
Half-metallic-ferrimagnetic Sr2CrWO6 and Sr2FeReO6 materials for room tempera... (IOSR Journals)
Complex perovskite-like materials that include magnetic transition elements are relevant because of their technological prospects in the spintronics industry. In this work, we report electronic and magnetic characterizations of Sr2CrWO6 and Sr2FeReO6 as room-temperature spintronics materials, using the linearized muffin-tin orbital (LMTO) method within the atomic-sphere approximation (ASA) and the local spin density approximation (LSDA). The exchange-correlation potential was included through the LSDA+U technique. The band structure results predict a half-metallic ferrimagnetic ground state for Sr2CrWO6 and Sr2FeReO6 with total magnetic moments of 1.878 μB and 3.184 μB per formula unit, respectively, in agreement with previous theoretical and experimental results.
Isolation and Characterization of Thermostable Protease Producing Bacteria fr... (IOSR Journals)
This study is a search for a potential thermostable protease-producing strain. Among nine protease-producing strains screened from soap industry effluent, one was selected as a promising thermostable protease producer and identified as Bacillus subtilis. The activity of the protease produced by this organism is stable up to 70°C. The optimum yield was achieved after 48 hours of culture at 65°C and pH 8.0, and the maximum protease activity was likewise observed at 65°C and pH 8.0.
Parametric sensitivity analysis of a mathematical model of facultative mutualism (IOSR Journals)
The complex dynamics of facultative mutualism are best described by a system of continuous nonlinear first-order ordinary differential equations. The 1-norm, 2-norm, and infinity-norm are used to quantify and differentiate the different forms of sensitivity of the model parameters. These contributions are presented and discussed.
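As a small illustration of this norm-based approach (the model function below is a made-up stand-in, not the paper's mutualism system), the sketch computes finite-difference parameter sensitivities and then their 1-, 2-, and infinity-norms:

```python
import math

def finite_diff_sensitivity(model, params, idx, h=1e-6):
    """Forward-difference sensitivity of the model output with respect
    to parameter idx (the model here is an illustrative stand-in)."""
    bumped = list(params)
    bumped[idx] += h
    return (model(bumped) - model(params)) / h

def sensitivity_norms(sens):
    """1-norm, 2-norm, and infinity-norm of a sensitivity vector, the
    three measures used to rank parameter importance."""
    one = sum(abs(s) for s in sens)
    two = math.sqrt(sum(s * s for s in sens))
    inf = max(abs(s) for s in sens)
    return one, two, inf

# Toy mutualism-style growth rate r(a, b) = a + 0.5*a*b (illustrative only):
# sensitivities at (a, b) = (1, 2) are dr/da = 2.0 and dr/db = 0.5.
model = lambda p: p[0] + 0.5 * p[0] * p[1]
params = [1.0, 2.0]
sens = [finite_diff_sensitivity(model, params, i) for i in range(2)]
one, two, inf = sensitivity_norms(sens)
```

The three norms always satisfy infinity-norm ≤ 2-norm ≤ 1-norm, which is why comparing them helps distinguish a single dominant parameter from many moderately influential ones.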
Channel selection is an improvement technique to optimize EEG-based BCI performance.
In previous studies, many channel selection methods—mostly based on spatial information of signals—have
been introduced. One of these channel selection techniques is the energy calculation method. In this paper,
we introduce an energy optimization calculation method, called the energy extraction method. Energy
extraction is an extension of the energy calculation method, and is divided into two steps. The first step is
energy calculation and the second is energy selection. In the energy calculation step, l2-norm is used to
calculate channel energy, while in the energy selection method we propose three techniques: “high value”
(HV), “close to mean” (CM), and “automatic”. All proposed framework schemes for energy extraction are
applied in two types of datasets. Two classes of datasets i.e. motor movement (hand and foot movement)
and motor imagery (imagination of left- and right-hand movement) were used. The system used a Common
Spatial Pattern (CSP) method to extract EEG signal features and k-NN as a classification method to classify
the signal features with k=3. Based on the test results, all schemes for the proposed energy extraction
method yielded improved BCI performance of up to 58%. In summary, the energy extraction approach using
the CM energy selection method was found to be the best channel selection technique.
An Identification of Multiple Harmonic Sources in a Distribution System by Us...journalBEEI
The identification of multiple harmonic sources (MHS) is vital to identify the root causes and the mitigation technique for a harmonic disturbance. This paper introduces an identification technique of MHS in a power distribution system by using a time-frequency distribution (TFD) analysis known as a spectrogram. The spectrogram has advantages in term of its accuracy, a less complex algorithm, and use of low memory size compared to previous methods such as probabilistic and harmonic power flow direction. The identification of MHS is based on the significant relationship of spectral impedances, which are the fundamental impedance (Z1) and harmonic impedance (Zh) that estimate the time-frequency representation (TFR). To verify the performance of the proposed method, an IEEE test feeder with several different harmonic producing loads is simulated. It is shown that the suggested method is excellent with 100% correct identification of MHS. The method is accurate, fast and cost-efficient in the identification of MHS in power distribution arrangement.
CADD UNIT V - Molecular Modeling: Introduction to molecular mechanics and quantum mechanics.Energy Minimization methods and Conformational Analysis, global conformational minima determination.
SVM-PSO based Feature Selection for Improving Medical Diagnosis Reliability u...cscpconf
Improving the accuracy of supervised classification algorithms in biomedical applications, especially CADx, is an active area of research. This paper proposes the construction of a rotation forest (RF) ensemble using 20 base learners over two clinical datasets, lymphography and backache. We propose a new feature selection strategy, based on support vector machines optimized by particle swarm optimization, to find a relevant, minimal feature subset that yields higher ensemble accuracy. We quantitatively analyzed the 20 base learners over the two datasets, carried out the experiments with a 10-fold cross-validation strategy, and evaluated the performance of the 20 classifiers using the metrics accuracy (acc), kappa value (K), root mean square error (RMSE), and area under the receiver operating characteristic curve (ROC). The base classifiers achieved average accuracies of 79.96% and 81.71% for the lymphography and backache datasets, respectively, while the RF ensembles produced average accuracies of 83.72% and 85.77%. The paper presents promising results using RF ensembles and provides a new direction toward the construction of reliable and robust medical diagnosis systems.
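The 10-fold cross-validation protocol used for evaluation can be sketched in plain Python. The nearest-centroid learner and toy data below are stand-ins of mine, not the paper's 20 base learners or its clinical datasets.

```python
import random
from statistics import mean

def nearest_centroid_fit(X, y):
    """Per-class feature means; a deliberately simple stand-in base learner."""
    classes = sorted(set(y))
    return {c: [mean(col) for col in zip(*[x for x, t in zip(X, y) if t == c])]
            for c in classes}

def nearest_centroid_predict(model, x):
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    return min(model, key=lambda c: dist(model[c], x))

def kfold_accuracy(X, y, k=10, seed=0):
    """Shuffle indices, split into k folds, average held-out accuracy."""
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    accs = []
    for fold in folds:
        train = [i for i in idx if i not in fold]
        model = nearest_centroid_fit([X[i] for i in train], [y[i] for i in train])
        hits = [nearest_centroid_predict(model, X[i]) == y[i] for i in fold]
        accs.append(mean(hits))          # True counts as 1, False as 0
    return mean(accs)

# Toy two-class data: class 0 near (0,0), class 1 near (3,3)
rng = random.Random(1)
X = [[rng.gauss(0, 1), rng.gauss(0, 1)] for _ in range(30)] + \
    [[rng.gauss(3, 1), rng.gauss(3, 1)] for _ in range(30)]
y = [0] * 30 + [1] * 30
acc = kfold_accuracy(X, y)
```

Each fold is held out exactly once, so every sample contributes to the averaged accuracy estimate.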
Neighborhood search methods with moth optimization algorithm as a wrapper met...IJECEIAES
Feature selection methods select a subset of features from data so that only useful information is mined from the samples, improving accuracy and the computational efficiency of the learning model. The Moth-Flame Optimization (MFO) algorithm is a population-based approach that simulates the behavior of real moths in nature. One drawback of the MFO algorithm, as we investigate in this paper, is that the solutions move toward the best solution and can easily become stuck in local optima. We therefore propose an MFO algorithm combined with a neighborhood search method for feature selection problems. To keep the MFO algorithm from getting trapped in local optima and to avoid premature convergence, the neighborhood search method is applied after a predefined number of unimproved iterations (the number of tries that fail to improve the current solution). The proposed algorithm shows good performance when compared with the original MFO algorithm and with state-of-the-art approaches.
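The escape mechanism described — switching to a neighborhood search after a set number of unimproved iterations — can be sketched with a single-solution toy optimizer. This is not MFO itself; the objective and all parameters are illustrative assumptions.

```python
import random

def sphere(x):
    """Test objective: minimum 0 at the origin."""
    return sum(v * v for v in x)

def search_with_escape(f, dim=5, iters=2000, patience=50, seed=0):
    """Random global moves until `patience` consecutive tries fail to improve,
    then Gaussian moves in a shrinking neighborhood of the best solution."""
    rng = random.Random(seed)
    best = [rng.uniform(-5, 5) for _ in range(dim)]
    best_f = f(best)
    stall, radius = 0, 1.0
    for _ in range(iters):
        if stall < patience:
            cand = [rng.uniform(-5, 5) for _ in range(dim)]     # global move
        else:
            cand = [b + rng.gauss(0, radius) for b in best]     # neighborhood move
            radius *= 0.99                                      # tighten the neighborhood
        cf = f(cand)
        if cf < best_f:
            best, best_f = cand, cf
            stall = 0 if stall < patience else patience  # stay local once triggered
        else:
            stall += 1
    return best, best_f

best, best_f = search_with_escape(sphere)
```

The stall counter plays the role of the "number of tries that fail to improve the current solution" from the abstract; the shrinking radius is an extra assumption of this sketch.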
Improved optimization of numerical association rule mining using hybrid parti...IJECEIAES
Particle Swarm Optimization (PSO) has been applied to optimization problems in various fields, such as Association Rule Mining (ARM) of numerical problems. However, PSO often becomes trapped in local optima, so its results do not represent the overall optimum solutions. To address this limitation, this study combines PSO with the Cauchy distribution (PARCD), which is expected to expand the search space and improve the global optimum found. Furthermore, this study uses multiple objective functions, i.e., support, confidence, comprehensibility, interestingness, and amplitude. The proposed method was evaluated using benchmark datasets, such as the Quake, Basketball, Body fat, Pollution, and Bolt datasets, and the results were compared with those obtained by previous studies. The results indicate that the overall values of the objective functions obtained using the proposed PARCD approach are satisfactory.
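A minimal PSO with an occasional Cauchy-distributed jump illustrates the PARCD idea of widening the search to escape local optima. This is a hedged sketch on a toy sphere objective, not the paper's multi-objective rule-mining setup; the jump rate and coefficients are my assumptions.

```python
import math
import random

def sphere(x):
    return sum(v * v for v in x)

def pso_cauchy(f, dim=2, n=20, iters=300, seed=0):
    """Global-best PSO; each particle occasionally takes a heavy-tailed
    Cauchy jump (tan-transform of a uniform variate) to escape local optima."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if rng.random() < 0.1:   # Cauchy jump: heavy tails allow long moves
                pos[i] = [p + 0.5 * math.tan(math.pi * (rng.random() - 0.5))
                          for p in pos[i]]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

gbest, gbest_f = pso_cauchy(sphere)
```

Because personal and global bests only ever improve, the jumps add exploration without degrading the tracked optimum.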
Determination of Proton Energy and Dosage to Obtain SOBP Curve in the Proton ...IJERA Editor
In this research, calculations and simulations to obtain the SOBP curve on a model of a thyroid tumor were performed by determining the energy and the number of protons in each proton beam. Simulations were carried out with the SRIM and TRIM codes, while the computational calculations and graphics were done with Python. The SOBP curve profile obtained in this study is flatter in the tumor region than previous results by other researchers, so it can be concluded that the tumor tissue receives a uniform lethal dose while the surrounding normal tissue receives non-lethal doses. Quantitatively, if the dose received by the tumor tissue is expressed as a 100% dose, the dose received by healthy tissue outside the tumor is at most 92%. In further development, this research method can be applied to other tissue models based on CT-scan images containing tumor tissue.
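The construction behind an SOBP — a weighted sum of pristine Bragg peaks, with heavier weights on the deeper peaks — can be sketched numerically. The peak model, ranges, and weights below are crude illustrative assumptions of mine, not SRIM/TRIM physics or the study's values.

```python
import math

def bragg_peak(d, r, w=0.3):
    """Toy pristine Bragg peak: entrance plateau, Gaussian peak at range r (cm),
    zero dose past the distal falloff (protons stop)."""
    if d > r + 2 * w:
        return 0.0
    return 0.3 + math.exp(-((d - r) ** 2) / (2 * w ** 2))

ranges  = [5.0, 5.5, 6.0, 6.5, 7.0]       # beam energies chosen to span the tumor
weights = [0.12, 0.14, 0.18, 0.25, 0.31]  # heavier weights for deeper peaks

def sobp(d):
    """Spread-out Bragg peak: weighted sum of the pristine peaks."""
    return sum(w * bragg_peak(d, r) for r, w in zip(ranges, weights))

depths = [i / 10 for i in range(91)]       # 0 to 9 cm
curve = [sobp(d) for d in depths]
# Dose is elevated across the tumor region (5-7 cm) and zero past the distal edge
```

Weighting the deeper peaks more compensates for the entrance dose the shallower beams deposit on their way in, which is what flattens the plateau.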
Traffic Light Signal Parameters Optimization Using Modification of Multielement...IJECEIAES
A strategy to optimize traffic light signal parameters is presented for solving the traffic congestion problem using a modification of the Multielement Genetic Algorithm (MEGA). The aim of this method is to improve the vehicle throughput (F) of earlier work on traffic light signal parameter optimization using MEGA and Particle Swarm Optimization (PSO). The modification of MEGA adds a hash table that saves some of the best populations to accelerate MEGA's recombination process; the result is called H-MEGA for short. The experimental results show that H-MEGA-based optimization outperforms the MEGA- and PSO-based methods, improving F by about 10.01% (from 82.63% to 92.64%) and 6.88% (from 85.76% to 92.64%), respectively. In addition, H-MEGA improves the real F of the Ooe Toroku road network of Kumamoto City, Japan, by about 21.62%.
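The hash-table idea — avoid recomputing what has already been evaluated — can be illustrated by memoizing fitness values in a toy genetic algorithm. This is a loose sketch of caching in a GA, not H-MEGA itself; the bit-string "throughput" surrogate and all parameters are invented for illustration.

```python
import random

evals = {"calls": 0}
cache = {}

def fitness(genome):
    """Toy throughput surrogate: how many signal-timing genes match a target."""
    key = tuple(genome)
    if key in cache:                  # hash-table hit: no re-evaluation
        return cache[key]
    evals["calls"] += 1
    target = (1, 0, 1, 1, 0, 1, 0, 0)
    cache[key] = sum(g == t for g, t in zip(genome, target))
    return cache[key]

rng = random.Random(0)
pop = [[rng.randint(0, 1) for _ in range(8)] for _ in range(20)]
for _ in range(50):                   # elitist GA: keep top half, refill with children
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    children = []
    for _ in range(10):
        a, b = rng.sample(parents, 2)
        cut = rng.randrange(1, 8)             # one-point crossover
        child = a[:cut] + b[cut:]
        if rng.random() < 0.1:                # occasional bit-flip mutation
            child[rng.randrange(8)] ^= 1
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
# Distinct evaluations are bounded by the 2**8 possible genomes, however many
# times the GA asks for a fitness value
```

As the population converges, most fitness requests become dictionary lookups, which is where the claimed acceleration comes from.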
EFFECTIVE REDIRECTING OF THE MOBILE ROBOT IN A MESSED ENVIRONMENT BASED ON TH...ijfls
The use of fuzzy logic in redirecting a mobile robot is based on two sets of received information: the first is the instantaneous distance of the robot from the obstacle, and the second is the instantaneous information on the robot's position. For this purpose, the fuzzy rule base consists of forty-two rules, extracted from the robot's distance from obstacles and the target position relative to the robot's instantaneous orientation. A minimum inference engine is used in the structure of the fuzzy systems, and an Extended Kalman Filter is used for localization in a noisy environment. Accordingly, the inputs of the fuzzy systems are determined from the estimate produced by the localization process, the information on the obstacle centers, and the target position. The linear acceleration and instantaneous orientation of the mobile robot are then determined by the fuzzy structures, which are applied to its kinematic model.
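A minimum-inference fuzzy step of the kind described can be sketched with two illustrative rules (the paper's rule base has forty-two); the membership functions, rule consequents, and units below are my assumptions, not the paper's design.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_steer(obstacle_dist, heading_err):
    """Two rules, min (Mamdani) inference, weighted-singleton defuzzification.
    Output: steering command in degrees (positive = turn toward target)."""
    near = tri(obstacle_dist, -0.1, 0.0, 1.0)   # obstacle close (m)
    far  = tri(obstacle_dist, 0.5, 2.0, 3.5)    # obstacle far (m)
    left = tri(heading_err, 0.0, 45.0, 90.0)    # target off to the left (deg)
    # Rule 1: IF obstacle near THEN turn hard away (singleton -60 deg)
    # Rule 2: IF obstacle far AND target left THEN turn toward target (+30 deg)
    w1 = near
    w2 = min(far, left)                         # minimum inference engine
    if w1 + w2 == 0:
        return 0.0
    return (w1 * -60.0 + w2 * 30.0) / (w1 + w2)

cmd = fuzzy_steer(obstacle_dist=0.2, heading_err=45.0)  # obstacle dominates
```

With a close obstacle the avoidance rule dominates; with a clear path the target-seeking rule takes over, which is the qualitative behavior the abstract describes.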
Our team had to use signal processing techniques to find the direction of desired incoming signals at an antenna while rejecting interference from other signals.
BFO – AIS: A FRAME WORK FOR MEDICAL IMAGE CLASSIFICATION USING SOFT COMPUTING...ijsc
Medical images provide diagnostic evidence and information about anatomical pathology. Database growth is enormous, as medical digital imaging equipment such as Magnetic Resonance Imaging (MRI), Computed Tomography (CT), and Positron Emission Tomography-CT (PET-CT) is part of clinical work. CT images distinguish various tissues according to gray levels to aid medical diagnosis, and CT is more reliable for early detection of tumours and haemorrhages as it provides the anatomical information needed to plan radiotherapy. The goal of medical information systems is to deliver information to the right persons at the right time and place to improve the quality and efficiency of the care process. This paper proposes an Artificial Immune System (AIS) classifier and a feature selection method based on hybrid Bacterial Foraging Optimization (BFO) with Local Search (LS) for medical image classification.
Metasem: An R Package For Meta-Analysis Using Structural Equation Modelling: ...Pubrica
This presentation explains metaSEM, an R package for meta-analysis using structural equation modelling:
1. SEM is used to formulate meta-analytic models for conducting meta-analysis, which is used to analyse structural relationships; the meta-analysis can be univariate, multivariate, or three-level.
2. The structural equation model (SEM) is generally optimized and fitted using the OpenMx package.
3. Routine analyses can be run with R in batch mode, either interactively or non-interactively; a graphical interface such as RStudio is a convenient way for users to interact with the analysis.
Learn More: https://pubrica.com/academy/
Half-metallic-ferrimagnetic Sr2CrWO6 and Sr2FeReO6 materials for room tempera...IOSR Journals
Complex perovskite-like materials that include magnetic transition elements are relevant because of their technological prospects in the spintronics industry. In this work, we report studies of the electronic and magnetic characterization of Sr2CrWO6 and Sr2FeReO6 as room-temperature spintronics materials using the linearized muffin-tin orbitals (LMTO) method through the atomic-sphere approximation (ASA) within the local spin density approximation (LSDA). The exchange-correlation potential was included through the LSDA+U technique. The room-temperature band structure results predict a half-metallic ferrimagnetic ground state for Sr2CrWO6 and Sr2FeReO6 with total magnetic moments of 1.878 μB and 3.184 μB per formula unit, respectively, in agreement with previous theoretical and experimental results.
Isolation and Characterization of Thermostable Protease Producing Bacteria fr...IOSR Journals
This study is a search for a potential thermostable protease-producing strain. Among nine protease-producing strains screened from soap industry effluent, one was selected as a promising thermostable protease producer and identified as Bacillus subtilis. The activity of the protease produced by this organism is stable up to 70ºC. The optimum yield was achieved after 48 hours of culture at 65ºC and pH 8.0, and the maximum protease activity was observed at 65ºC and pH 8.0.
Parametric sensitivity analysis of a mathematical model of facultative mutualismIOSR Journals
The complex dynamics of facultative mutualism is best described by a system of continuous non-linear first order ordinary differential equations. The methods of 1-norm, 2-norm, and infinity-norm will be used to quantify and differentiate the different forms of the sensitivity of model parameters. These contributions will be presented and discussed.
Gravitational Blue Shift Confirms the New Phenomenon of the Vertical Aether F...IOSR Journals
In fact, the vertical position of the Michelson-Morley experiment is not the only possible explanation for the new phenomenon of the vertical Aether flow into any mass or any fundamental building block. This paper shows, for the first time, that the cosmic blue shift due to a gravitational field is a direct consequence of the vertical Aether flow into any mass. The vertical Aether speeds of different stellar objects are given, which suggests reclassifying the categories of black holes. To confirm the theory presented, new formulas for the Doppler effect in a gravitational field and its correlation with time dilation, as derived from General Relativity, have also been derived. The theoretical expressions corresponding to the two experimental results are given, and a new prediction is proposed, for the first time, to confirm the theories presented in this paper.
Inventory Management System and Performance of Food and Beverages Companies i...IOSR Journals
Inventory management decisions are an integral aspect of organisations. Inventory postponement as
argued by Bucklin (1965) is where a firm deliberately delays the purchase and the physical possession of
inventory items until demand or usage requirements are known with certainty. This is an effective supply chain strategy adopted by most manufacturing organisations, since it reduces inventory and, in turn, the cost of obsolete stock. This study explores the relationship between inventory management and control and the performance of Food and Beverages companies in Nigeria. Secondary data were obtained from the annual financial reports and accounts of Food and Beverages companies listed on the Nigerian Stock Exchange and analyzed using simple and multiple regression models. The results show that there is a significant relationship between inventory management and control and the performance of Food and Beverages companies in Nigeria (multiple regression correlation coefficient R = 0.996, R2 = 0.990, p-value = 0.00 < 0.05). The results also show the relative importance of the inventory management decisions made by the organisation and the implications these decisions have for the consumer. The findings show that the three key qualities essential in inventory management decisions for a manufacturing organisation, from the perspective of the third-party logistics provider, are customer satisfaction, on-time delivery, and order fulfillment.
Energy Efficiency in IEEE 802.11 standard WLAN through MWTPPIOSR Journals
The main goal of this work is to achieve energy efficiency in an 802.11 WLAN by minimizing the energy consumption in the network. In this study we introduce a modification of PCF that enhances WLAN performance by redefining the PCF transmission function. In standard PCF, the AP transmits to the various nodes one way during the polling process. The proposed modification extends the IEEE 802.11 standard PCF into the Multi-Way Transmission PCF Protocol (MWTPP), a low-complexity mechanism by which the active and non-active stations in the BSS save energy during polling. With MWTPP, transmissions take place in a multi-way fashion: mobile nodes in the generated polling list access the WLAN channel within the SIFS interval whenever they receive a data packet from the AP.
Old wine in new wineskins: Revisiting counselling in traditional Ndebele and ...IOSR Journals
The institution of counselling is present in all human communities as people share their sorrows,
mentor, empower and advise each other. The service of advising and grooming is all that counselling is. This
paper seeks to explore the institution of counselling in Shona and Ndebele traditional societies before the advent
of western formalised counselling institutions. The research sets out to show that counselling is not a new phenomenon in these societies, nor a remnant of colonialism, but rather an old institution that has been window-dressed with western strategies and formalisms. African traditional counselling strategies, as seen in the Shona and Ndebele examples, place more emphasis on preventive counselling than on crisis counselling. Advice and mentoring are prioritised in these societies as a way of helping people stay out of trouble that would otherwise later require therapeutic or crisis counselling. Modern-day counselling has been professionalised and commercialised and requires people to pay for it, yet in Shona and Ndebele traditional societies it was part of one's responsibility to make sure others were well advised and counselled if they were emotionally troubled. Professional counselling in marriage, career guidance, and teenage grooming, for example, is not a new practice but an old practice done differently, like old wine in new wineskins.
Hybrid Algorithm for Dose Calculation in Cms Xio Treatment Planning SystemIOSR Journals
This study aimed at designing an improved hybrid algorithm by explicitly solving the linearized Boltzmann transport equation (LBTE), the governing equation that describes the macroscopic behaviour of radiation particles (neutrons, photons, electrons, etc.). The algorithm's accuracy was evaluated using a newly designed in-house verification phantom, and its results were compared with those of the other XiO photon algorithms. The LBTE was solved numerically to compute photon transport in a medium. A programming code (algorithm) for the LBTE solution was developed and applied in the treatment planning system (TPS). The accuracy of the algorithm was evaluated by creating several plans for both the designed phantom and a solid water phantom using the designed algorithm and the other XiO photon algorithms. The plans were sent to a pre-calibrated Elekta linear accelerator for measurement of absorbed dose. The results for all treatment plans using the hybrid algorithm were within a 4% limit of the three XiO photon algorithms. Calculation time for the hybrid algorithm was lower than the other algorithms' in plans with a larger number of beams, but higher for single-beam plans. The hybrid algorithm provides accuracy comparable to the other algorithms under treatment planning conditions and, being easy to implement, can be employed for dose calculation in advanced techniques such as IMRT and RapidArc at radiotherapy centres with the CMS XiO treatment planning system.
Monte Carlo Dose Algorithm Clinical White PaperBrainlab
Learn more: https://www.brainlab.com/iplan-rt
Conventional dose calculation algorithms, such as Pencil Beam, are proven effective for tumors located in homogeneous regions with similar tissue consistency, such as the brain. However, these algorithms tend to overestimate the dose distribution for tumors diagnosed in extracranial regions, such as the lung and the head and neck, where large inhomogeneities exist. Due to the inconsistencies seen in current calculation methods for extracranial treatments and the need for more precise radiation delivery, research has led to the creation and integration of improved calculation methods into treatment planning software.
A magnetic resonance spectroscopy driven initialization scheme for active sha...TRS Telehealth Services
Segmentation of the prostate boundary on clinical images is useful in a large number of applications, including calculation of prostate volume pre- and post-treatment, detection of extra-capsular spread, and creation of patient-specific anatomical models. Manual segmentation of the prostate boundary is, however, time consuming and subject to inter- and intra-reader variability.
A Magnetic Resonance Spectroscopy Driven Initialization Scheme
for Active Shape Model Based Prostate Segmentation.
Robert Toth1, Pallavi Tiwari1, Mark Rosen2, Galen Reed3, John Kurhanewicz3,
Arjun Kalyanpur4, Sona Pungavkar5, and Anant Madabhushi1
1Rutgers, The State University of New Jersey,
Department of Biomedical Engineering, Piscataway, NJ 08854, USA.
2 University of Pennsylvania,
Department of Radiology, Philadelphia, PA 19104, USA.
3 University of California,
San Francisco, CA, USA.
4 Teleradiology Solutions,
Bangalore, 560048, India.
5 Dr. Balabhai Nanavati Hospital,
Mumbai, 400056, India.
Comparative analysis of multimodal medical image fusion using pca and wavelet...IJLT EMAS
Nowadays there are a lot of medical images, and their number increases day by day. These medical images are stored in large databases. Medical image fusion is used to minimize redundancy and optimize the storage capacity of images. The main aim of medical image fusion is to combine complementary information from multiple imaging modalities (e.g., CT, MRI, PET) of the same scene; after fusion, the resulting image is more informative and more suitable for patient diagnosis. This paper presents two approaches to image fusion, namely spatial fusion and transform fusion, describing Principal Component Analysis, a spatial-domain technique, and the Discrete Wavelet Transform and Stationary Wavelet Transform, which are transform-domain techniques. Performance metrics are implemented to evaluate the image fusion algorithms. Experimental results show that image fusion based on the Stationary Wavelet Transform outperforms both Principal Component Analysis and the Discrete Wavelet Transform.
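The spatial-domain PCA fusion rule mentioned above is commonly implemented by weighting each source image with the components of the leading eigenvector of the sources' 2x2 covariance matrix. A minimal sketch on toy 1-D "images" (my own data, not the paper's):

```python
import math
from statistics import mean

def pca_weights(a, b):
    """Fusion weights from the leading eigenvector of the 2x2 covariance
    matrix of the two flattened source images."""
    ma, mb = mean(a), mean(b)
    caa = mean((x - ma) ** 2 for x in a)
    cbb = mean((y - mb) ** 2 for y in b)
    cab = mean((x - ma) * (y - mb) for x, y in zip(a, b))
    # Leading eigenvalue of [[caa, cab], [cab, cbb]] in closed form
    lam = 0.5 * (caa + cbb + math.sqrt((caa - cbb) ** 2 + 4 * cab ** 2))
    v1, v2 = cab, lam - caa            # unnormalized leading eigenvector
    return v1 / (v1 + v2), v2 / (v1 + v2)

# Two registered views of the same scene; `a` carries more contrast (variance)
a = [10, 12, 50, 52, 11, 13]
b = [11, 13, 20, 22, 12, 14]
w1, w2 = pca_weights(a, b)
fused = [w1 * x + w2 * y for x, y in zip(a, b)]  # higher-variance source dominates
```

The weights sum to one, and the source with more variance (more detail) receives the larger weight, which is the intuition behind PCA fusion.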
A verification of periodogram technique for harmonic source diagnostic analyt...TELKOMNIKA JOURNAL
A harmonic source diagnostic analytic is vital to identify the root causes and the type of harmonic source in a power system. This paper introduces a verification of the periodogram technique for diagnosing harmonic sources using a logistic regression classifier. A periodogram gives a correct and accurate classification of harmonic signals. Signature recognition patterns are used to distinguish the harmonic sources accurately by obtaining the distribution of harmonic and interharmonic components and the changes in harmonic contribution. This is achieved by using the significant signature recognition of harmonic-producing loads obtained from the harmonic contribution changes. To verify the performance of the proposed method, a logistic regression classifier analyses the results and gives the accuracy and positive-rate percentage of the proposed method. The adequacy of the proposed methodology is tested and verified on a distribution system for several rectifier- and inverter-based loads.
Particle Swarm Optimization for Nano-Particles Extraction from Supporting Mat...CSCJournals
Metallic and non-metallic nano-particles have attracted much interest because of their wide applications. Transmission electron microscopy (TEM) is the state-of-the-art method to characterize a nano-particle with respect to size, morphology, structure, or composition. This paper presents an efficient evolutionary computational method, particle swarm optimization (PSO), for automatic segmentation of nano-particles. A threshold-based segmentation technique is applied, in which image entropy is treated as a minimization problem to specify local and global thresholds. We are concerned with reducing wrong characterization of nano-particles due to concentration of liquid solutions or supporting material within the acquired image. The obtained results are compared with manual techniques and with previous research in this area.
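Entropy-driven threshold selection can be sketched with a Kapur-style criterion on a toy bimodal histogram; here an exhaustive scan stands in for the PSO search. Note that the paper poses entropy as a minimization problem, while the classic Kapur formulation maximizes it; the histogram and formulation here are my assumptions.

```python
import math

def kapur_entropy(hist, t):
    """Sum of background and foreground Shannon entropies for threshold t."""
    total = sum(hist)
    def h(lo, hi):
        p = sum(hist[lo:hi]) / total       # class probability
        if p == 0:
            return 0.0
        return -sum((c / total / p) * math.log(c / total / p)
                    for c in hist[lo:hi] if c)
    return h(0, t) + h(t, len(hist))

# Toy gray-level histogram: dark background, a valley, bright particles
hist = [30, 40, 35, 5, 1, 0, 1, 4, 25, 20]
# Exhaustive scan over thresholds as a stand-in for the PSO search
best_t = max(range(1, len(hist)), key=lambda t: kapur_entropy(hist, t))
# best_t falls in the valley separating background from particle pixels
```

In the paper the same objective would be evaluated by PSO particles rather than by an exhaustive scan, which matters once the threshold search space is large (e.g. multi-level thresholding).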
Signal Processing and Soft Computing Techniques for Single and Multiple Power...idescitation
In this paper, a review of the various methods and approaches used for the detection and classification of power quality (PQ) events is presented. The survey is divided into two main categories: one in which only single events are considered, and another in which combined events are considered. A table is also provided to present a comparative analysis of some of the references. Applications of wavelets, the need for power quality indices, and optimization techniques are also described. The aim of this paper is to show the performance of the various methodologies so that the appropriate technique can be chosen for the detection and classification of PQ events.
Attenuation correction for PET/MR hybrid imaging frameworks, along with dose planning for MR-based radiation treatment, remains challenging because high-energy photon attenuation information is lacking. We present a new method that uses learned nonlinear local descriptors and feature matching to predict pseudo-CT images from T1w and T2w MRI data. The nonlinear local descriptors are obtained by projecting linear descriptors into a nonlinear high-dimensional space using an explicit feature map and low-rank approximation with supervised manifold regularization. The nearest neighbors of each local descriptor in the input MR images are searched within a constrained spatial range of the MR images in the training dataset. The pseudo-CT patches are then estimated through k-nearest-neighbor regression. The proposed procedure for pseudo-CT prediction is quantitatively analyzed on a dataset consisting of matched brain MRI and CT images from 13 subjects.
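The final estimation step — k-nearest-neighbor regression from MR-derived descriptors to CT intensities — can be sketched as follows; the descriptors, intensity values, and k are toy assumptions, not the paper's learned nonlinear descriptors.

```python
def knn_regress(query, train_x, train_y, k=3):
    """Predict a target as the mean of the targets of the k nearest training
    descriptors (squared Euclidean distance)."""
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    nearest = sorted(range(len(train_x)), key=lambda i: dist(query, train_x[i]))[:k]
    return sum(train_y[i] for i in nearest) / k

# Toy setting: 2-D MR-derived descriptors -> scalar CT intensity (HU-like)
train_x = [(0.10, 0.20), (0.15, 0.25), (0.90, 0.80), (0.85, 0.75), (0.12, 0.22)]
train_y = [40.0, 45.0, 900.0, 880.0, 42.0]   # soft-tissue vs bone-like values
hu = knn_regress((0.11, 0.21), train_x, train_y, k=3)
```

In the paper the neighbor search is additionally restricted to a spatial window around the query patch, which keeps the regression anatomically plausible and fast.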
LASSO MODELING AS AN ALTERNATIVE TO PCA BASED MULTIVARIATE MODELS TO SYSTEM W...mathsjournal
Principal component analysis (PCA) is widespread and widely used in various areas of science, such as bioinformatics, econometrics, and chemometrics. However, PCA is based on eigenvalues and eigenvectors, which is a weak approach for high-dimensional systems with some degree of sparsity; in these situations PCA is no longer a recommended procedure. Sparsity is very common in near-infrared spectroscopy because of the large number of spectra required and the broad water-absorption bands, which make the spectra very similar and the data matrix heavily sparse, degrading precision and accuracy in multivariate modeling and in projections of the data matrix onto smaller dimensions. To overcome these shortcomings, the LASSO, a non-PCA-based method, was applied to a NIR spectra dataset from biodiesel, and its performance was statistically compared with traditional multivariate models such as PCR and PLSR.
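LASSO is typically fitted by coordinate descent built on the soft-thresholding (shrinkage) operator; a minimal sketch on toy data (not the NIR biodiesel dataset) shows the hallmark behavior of zeroing out an irrelevant coefficient.

```python
def soft_threshold(z, g):
    """The LASSO proximal operator: shrink z toward zero by g."""
    if z > g:
        return z - g
    if z < -g:
        return z + g
    return 0.0

def lasso_cd(X, y, lam, iters=200):
    """Coordinate descent for min_b 0.5*||y - X b||^2 + lam*||b||_1."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # Partial residual excluding feature j
            r = [y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            zjj = sum(X[i][j] ** 2 for i in range(n))
            b[j] = soft_threshold(rho, lam) / zjj
    return b

# Tiny example: y depends on the first feature only; LASSO zeroes the second
X = [[1.0, 0.1], [2.0, -0.1], [3.0, 0.2], [4.0, -0.2]]
y = [2.0, 4.1, 5.9, 8.0]
b = lasso_cd(X, y, lam=1.0)
```

The exact zero in `b[1]` is the sparsity that makes LASSO attractive for the nearly collinear, sparse NIR matrices the abstract describes, where PCA-based projections struggle.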
Metabolomics is the study of the complete set of metabolites in a specific cell or organism. Metabolomics analysis aims at the simultaneous identification and quantitative analysis of intracellular metabolites. Since metabolomics covers a whole set of metabolites, it reflects the metabolic activity of the organism and hence allows researchers to explore the biological system. Accurate metabolomics studies rely on sensitive, sophisticated analytic platforms and bioinformatics analysis systems. With years of developing and refining our bioinformatics analysis system, Creative Proteomics offers comprehensive bioinformatics support to our clients' research! https://www.creative-proteomics.com/services/bioinformatic-analysis-for-metabolomics-study.htm
The basics of data processing are to convert the original data file into a representation to help easily access the characteristics of each observed ion. These characteristics include ion retention time and m/z time, as well as ion intensity measurements in each raw data file. In addition to these basic features, data processing can also extract other information, such as the isotope distribution of ions. https://www.creative-proteomics.com/services/bioinformatic-univariate-analysis-service.htm
Professional air quality monitoring systems provide immediate, on-site data for analysis, compliance, and decision-making.
Monitor common gases, weather parameters, particulates.
Brief information about the SCOP protein database used in bioinformatics.
The Structural Classification of Proteins (SCOP) database is a comprehensive and authoritative resource for the structural and evolutionary relationships of proteins. It provides a detailed and curated classification of protein structures, grouping them into families, superfamilies, and folds based on their structural and sequence similarities.
This PDF is about schizophrenia.
Cancer cell metabolism: special Reference to Lactate PathwayAADYARAJPANDEY1
Normal Cell Metabolism:
Cellular respiration describes the series of steps that cells use to break down sugar and other chemicals to get the energy they need to function.
Energy is stored in the bonds of glucose and when glucose is broken down, much of that energy is released.
Cells utilize energy in the form of ATP.
The first step of respiration is called glycolysis. In a series of steps, glycolysis breaks glucose into two smaller molecules of a chemical called pyruvate. A small amount of ATP is formed during this process.
Most healthy cells continue the breakdown in a second process, called the Krebs cycle. The Krebs cycle allows cells to “burn” the pyruvate made in glycolysis to get more ATP.
The last step in the breakdown of glucose is called oxidative phosphorylation (Ox-Phos).
It takes place in specialized cell structures called mitochondria. This process produces a large amount of ATP. Importantly, cells need oxygen to complete oxidative phosphorylation.
If a cell completes only glycolysis, only 2 molecules of ATP are made per glucose. However, if the cell completes the entire respiration process (glycolysis, Krebs cycle, oxidative phosphorylation), about 36 molecules of ATP are created, giving it much more energy to use.
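The 2-versus-36 ATP figures above imply an 18-fold difference in the glucose a purely glycolytic cell must consume for the same energy budget; a quick check of that arithmetic (the 360-ATP budget is an arbitrary illustration):

```python
ATP_GLYCOLYSIS = 2      # net ATP per glucose, glycolysis only
ATP_FULL = 36           # approximate ATP per glucose, full respiration

energy_needed = 360     # arbitrary ATP budget for illustration
glucose_full = energy_needed / ATP_FULL              # 10 glucose molecules
glucose_glycolysis = energy_needed / ATP_GLYCOLYSIS  # 180 glucose molecules
ratio = glucose_glycolysis / glucose_full            # 18-fold more sugar
```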
IN CANCER CELL:
Unlike healthy cells that "burn" the entire molecule of sugar to capture a large amount of energy as ATP, cancer cells are wasteful.
Cancer cells only partially break down sugar molecules. They overuse the first step of respiration, glycolysis. They frequently do not complete the second step, oxidative phosphorylation.
This results in only 2 molecules of ATP per each glucose molecule instead of the 36 or so ATPs healthy cells gain. As a result, cancer cells need to use a lot more sugar molecules to get enough energy to survive.
Introduction to the Warburg phenomenon:
WARBURG EFFECT: Usually, cancer cells are highly glycolytic (glucose addiction) and take up more glucose from outside than normal cells do.
Otto Heinrich Warburg (8 October 1883 – 1 August 1970) was awarded the Nobel Prize in Physiology or Medicine in 1931 for his "discovery of the nature and mode of action of the respiratory enzyme."
WARBURG EFFECT: the tendency of cancer cells under aerobic (well-oxygenated) conditions to metabolize glucose to lactate (aerobic glycolysis) is known as the Warburg effect. Warburg made the observation that tumor slices consume glucose and secrete lactate at a higher rate than normal tissues.
IOSR Journal of Applied Physics (IOSR-JAP)
e-ISSN: 2278-4861.Volume 5, Issue 6 (Jan. 2014), PP 72-86
www.iosrjournals.org
Simulation of the Linear Boltzmann Transport Equation in
Modelling Of Photon Beam Data
Akpochafor M. O., Aweda M. A., Durosinmi-Etti F.A., Adeneye S.O.,
Omojola AD.
Department of Radiation Biology, Radiotherapy, Radiodiagnosis and Radiography,
College of Medicine/Lagos University Teaching Hospital, PMB 12003, Lagos, Nigeria.
Abstract: A beam data modelling algorithm was developed by solving the linear Boltzmann transport equation (LBTE). The LBTE is a form of the Boltzmann transport equation (BTE) that assumes radiation particles interact only with the matter they are passing through and not with each other, a condition valid only in the absence of an external magnetic field. The numerical method proposed by Lewis et al. [9] was used to solve the LBTE. A programming code was developed for the LBTE and run on a CMS XiO treatment planning system to generate beam data, and the generated beam data were compared to experimentally determined data. The calculated percentage depth doses (PDDs) completely overlap the measured PDDs for the small field sizes, while there is a shift in the PDD tail for the large field sizes; however, the shift is negligible. For the wedge PDDs, the shift between the measured and calculated PDDs occurs at the Dmax region and increases with field size. The calculated wedge profiles have a slight shift at the shoulder compared to the measured ones, and this decreases with increasing field size, unlike the PDDs. There is also a slight shift between the calculated and measured in-plane profiles. Overall, there is good agreement between the measured and calculated beam data. This algorithm can be implemented as an in-house algorithm for beam data modelling and also as an independent quality assurance tool for checking the accuracy of clinical TPS algorithms with regard to beam data modelling during quality assurance and TPS commissioning tests.
Keywords: linear Boltzmann transport equation (LBTE), treatment planning system, algorithm, beam profile, percentage depth dose.
I. Introduction
Computerized treatment planning systems (TPS) are used in external beam radiotherapy to simulate beam shapes and dose distributions with the intent to optimize tumour control and minimize normal tissue complications [7]. Treatment simulation involves the geometric and radiological aspects of the treatment and is based on radiation transport and optimization principles. TPS facilitate prescribed dose delivery, in which a number of patient and tumour parameters have to be taken into consideration, such as shape, size, and depth. Following acquisition of a new TPS, it is necessary to perform the commissioning tests, a
process which involves the entry of beam data measured at the linear accelerator into the TPS for precise
modelling of the dose distribution. Beam profiles and Percentage depth doses (PDD) are some of the most
important beam characteristics required for the commissioning of the TPS. The profiles and the PDD combine to
form the isodose curves which determine the dose distribution in the treatment plan of a radiotherapy patient.
The profile tails also determine the penumbra size of the dose distribution which plays a significant role in the
total dose distribution. An improvement in the penumbra size of the profile will lead to a better dose
distribution. The modelling of the beam data is done using the TPS software. The accuracy of the model
depends on the software parameters [3]. Several algorithms contained in the TPS software play different roles; among these, the dose calculation algorithms play the central role of calculating the dose distribution within the target volume [1]. An algorithm is a sequence of instructions that uses a set of input patient and dosimetric data and transforms that information into a set of desired output results [8]. For every algorithm, the
precision of the dose calculation depends on the input parameters. Different types of dose calculation algorithms
are used in modern TPS. The early TPS calculation methods were based on tabular representation of the dose
distribution obtained directly from beam measurements. As computational power grew, calculation models became more sophisticated, and TPS calculation algorithms progressively matured towards more
physically based models. The most advanced current algorithms are based on the Monte Carlo approach where
the histories of many millions of photons are traced as they interact with matter using basic physics interactions.
There is a full range of possibilities between table-based models and Monte Carlo models. For every algorithm,
the quality of the dose calculation is strongly dependent on the data or parameters used by the algorithm and its
accuracy to predict dose rely on the assumptions and approximations that the algorithm makes. The type and
quantity of the data needed varies according to the model. Usually, for measurement based models a lot of tables
are required, whereas for physical based models only some parameters are necessary. Good understanding of the
algorithms used within the TPS can help the user understand the strength and limitations of the particular
algorithm. This can also help the user diagnose TPS problems and develop a quality assurance (QA) protocol. It
is important to understand the general principles of the model and its implementation details. The model
parameters and input data have a significant impact on the accuracy of the calculated results. Even if the model
is able to account for a given physical effect, the actual implementation in the treatment planning software is
often simplified, leading to inaccurate or unexpected results in certain situations. Because of this, an independent way of checking an algorithm's accuracy in beam modelling is vital to a proper QA
exercise. Following the acceptance and commissioning tests of a computerized TPS, a quality assurance
program should be established to verify the performance of the system. Several ways of carrying out quality assurance have been proposed in the literature [1-6]. It is necessary that each department develop its own protocol based on the availability of relevant equipment and according to local requirements, using standard methods as a guideline.
In this study, the linear Boltzmann transport equation (LBTE) was solved following the numerical methods described by Lewis et al. [9]. A programming code was developed for the LBTE and run on a CMS XiO treatment planning system to generate beam profiles and PDDs. The generated beam data were compared with experimentally measured and analyzed data.
II. Methods And Material
The Boltzmann transport equation (BTE) is the governing equation which describes the macroscopic
behaviour of radiation particles like photons, electrons, neutrons, protons, etc. as they travel through and interact
with matter. The Linear Boltzmann transport equation (LBTE) is a form of the BTE which assumes that
radiation particles only interact with the matter they are passing through and not with each other. This is valid
for conditions in the absence of external magnetic fields. There are different ways of solving the LBTE,
however, the numerical method proposed by Lewis et al. [9] is the method of choice for solving the equation explicitly. The LBTE was solved using a method similar to that of Vassiliev et al. [15]:
\hat{\Omega} \cdot \nabla \psi^{\gamma} + \sigma_t^{\gamma} \, \psi^{\gamma} = q^{\gamma\gamma} + q^{\gamma} , ...................................................................................................1

\hat{\Omega} \cdot \nabla \psi^{e} + \sigma_t^{e} \, \psi^{e} - \frac{\partial}{\partial E}\left( S_R \, \psi^{e} \right) = q^{ee} + q^{\gamma e} + q^{e} , ...................................................................2
where
\psi^{\gamma} = \psi^{\gamma}(\vec{r}, E, \hat{\Omega}) is the angular photon fluence (or fluence if not time integrated), a function of position \vec{r} = (x, y, z), energy E, and direction \hat{\Omega};
\psi^{e} = \psi^{e}(\vec{r}, E, \hat{\Omega}) is the angular electron fluence;
q^{\gamma\gamma} = q^{\gamma\gamma}(\vec{r}, E, \hat{\Omega}) is the photon-to-photon scattering source resulting from photon interactions;
q^{ee} = q^{ee}(\vec{r}, E, \hat{\Omega}) is the electron-to-electron scattering source resulting from electron interactions;
q^{\gamma e} = q^{\gamma e}(\vec{r}, E, \hat{\Omega}) is the photon-to-electron production source resulting from photon interactions;
q^{\gamma} = q^{\gamma}(E, \hat{\Omega}) is the extraneous photon source for a point source P at position \vec{r}_p; this source represents all photons coming from the machine source model (Wareing et al., 2000);
q^{e} = q^{e}(E, \hat{\Omega}) is the extraneous electron source for a point source P at position \vec{r}_p; this source represents all electrons coming from the machine source model;
\sigma_t^{\gamma} = \sigma_t^{\gamma}(\vec{r}, E) is the macroscopic photon total cross section in cm^{-1};
\sigma_t^{e} = \sigma_t^{e}(\vec{r}, E) is the macroscopic electron total cross section in cm^{-1};
\sigma_t = \sigma_t(\vec{r}, E) is the macroscopic total cross section in cm^{-1};
S_R = S_R(\vec{r}, E) is the restricted collisional plus radiative stopping power.
The first term on the left-hand side of Equations 1 and 2 is the streaming operator. The second term on the left-hand side of Equations 1 and 2 is the collision or removal operator. Equation 2 is the Boltzmann Fokker-Planck transport equation [11-12], which is solved for the electron transport. In Equation 2, the third term on the left represents the continuous slowing down (CSD) operator, which accounts for Coulomb "soft" electron collisions. The right-hand side of Equations 1 and 2 includes the scattering, production, and external source terms (q^{\gamma} and q^{e}). The scattering and production sources are defined by:
q^{\gamma\gamma}(\vec{r}, E, \hat{\Omega}) = \int_0^{\infty} dE' \int_{4\pi} d\Omega' \, \sigma_s^{\gamma\gamma}(\vec{r}, E' \to E, \hat{\Omega}' \cdot \hat{\Omega}) \, \psi^{\gamma}(\vec{r}, E', \hat{\Omega}') , ..............................................3

q^{\gamma e}(\vec{r}, E, \hat{\Omega}) = \int_0^{\infty} dE' \int_{4\pi} d\Omega' \, \sigma_s^{\gamma e}(\vec{r}, E' \to E, \hat{\Omega}' \cdot \hat{\Omega}) \, \psi^{\gamma}(\vec{r}, E', \hat{\Omega}') , ...............................................4

q^{ee}(\vec{r}, E, \hat{\Omega}) = \int_0^{\infty} dE' \int_{4\pi} d\Omega' \, \sigma_s^{ee}(\vec{r}, E' \to E, \hat{\Omega}' \cdot \hat{\Omega}) \, \psi^{e}(\vec{r}, E', \hat{\Omega}') , ...............................................5
where
\sigma_s^{\gamma\gamma} is the macroscopic photon-to-photon differential scattering cross section;
\sigma_s^{\gamma e} is the macroscopic photon-to-electron differential production cross section;
\sigma_s^{ee} is the macroscopic electron-to-electron differential scattering cross section.
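In a discrete-ordinates implementation, the continuous integrals in Equations 3-5 reduce to sums over energy groups (and quadrature directions). A minimal Python sketch of this multigroup contraction, using made-up group cross sections and fluences rather than the paper's data:

```python
import numpy as np

# Hypothetical multigroup data: G energy groups, isotropic scattering for brevity.
G = 4
# sigma_s[g, g'] is the g' -> g scattering cross section (cm^-1); lower triangle
# only, i.e. downscatter to equal or lower energy groups.
sigma_s = np.array([[0.20, 0.00, 0.00, 0.00],
                    [0.05, 0.15, 0.00, 0.00],
                    [0.02, 0.06, 0.10, 0.00],
                    [0.01, 0.03, 0.05, 0.08]])
psi = np.array([1.0, 0.8, 0.5, 0.2])   # scalar fluence per group (arbitrary units)

# Discrete analogue of Eq. 3: q[g] = sum_{g'} sigma_s[g, g'] * psi[g']
q = sigma_s @ psi
```

The matrix-vector product is exactly the group-wise sum that replaces the energy integral once the cross sections are condensed into groups.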
The following Equation 6 represents the un-collided photon fluence:

\hat{\Omega} \cdot \nabla \psi_{unc}^{\gamma} + \sigma_t^{\gamma} \, \psi_{unc}^{\gamma} = q^{\gamma}(E, \hat{\Omega}) \, \delta(\vec{r} - \vec{r}_p) . ................................................................................6
A property of Equation 6 is that \psi_{unc}^{\gamma} can be solved for analytically. Doing so provides the following expression for the un-collided photon angular fluence from a point source:
\psi_{unc}^{\gamma}(\vec{r}, E, \hat{\Omega}) = \frac{q^{\gamma}(E, \hat{\Omega}_{\vec{r},\vec{r}_p}) \, e^{-\tau(\vec{r}_p, \vec{r})}}{4\pi \, |\vec{r} - \vec{r}_p|^2} \, \delta(\hat{\Omega} - \hat{\Omega}_{\vec{r},\vec{r}_p}) , ......................................................................7
where \hat{\Omega}_{\vec{r},\vec{r}_p} = \frac{\vec{r} - \vec{r}_p}{|\vec{r} - \vec{r}_p|}, and \vec{r}_p and \vec{r} are the source and destination points of the ray trace, respectively. \tau(\vec{r}_p, \vec{r}) is the optical distance (measured in mean free paths) between \vec{r} and \vec{r}_p.
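Equation 7 can be evaluated directly by ray tracing from the source point. A minimal Python sketch for a homogeneous medium, where the optical distance reduces to \sigma_t times the path length (the source strength and cross section are illustrative assumptions; the delta function is handled implicitly by evaluating along the source-to-point direction):

```python
import numpy as np

def uncollided_fluence(r, r_p, q_gamma, sigma_t):
    """Uncollided point-source fluence (Eq. 7) in a homogeneous medium.

    r, r_p  : destination and source points (3-vectors, cm)
    q_gamma : source strength emitted toward r (per steradian, arbitrary units)
    sigma_t : macroscopic total cross section (cm^-1); uniform here, so the
              optical distance tau reduces to sigma_t * |r - r_p|.
    """
    d = np.linalg.norm(np.asarray(r, float) - np.asarray(r_p, float))
    tau = sigma_t * d                       # optical distance in mean free paths
    # exponential attenuation times inverse-square spreading
    return q_gamma * np.exp(-tau) / (4.0 * np.pi * d**2)

# In vacuum (sigma_t = 0) only the inverse-square term remains:
phi1 = uncollided_fluence([0.0, 0.0, 10.0], [0.0, 0.0, 0.0], 1.0, 0.0)
phi2 = uncollided_fluence([0.0, 0.0, 20.0], [0.0, 0.0, 0.0], 1.0, 0.0)
```

Doubling the distance in vacuum quarters the fluence, which is a quick sanity check on the implementation.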
Once the electron angular fluence was solved for all energy groups, the dose in any output grid voxel was obtained through the following equation proposed by Siebers et al. [13]:

D_i = \int_0^{\infty} dE \int_{4\pi} d\hat{\Omega} \, \frac{\sigma_{ED}^{e}(\vec{r}, E)}{\rho(\vec{r})} \, \psi^{e}(\vec{r}, E, \hat{\Omega}) , .....................................................................................8

where
\sigma_{ED}^{e} is the macroscopic electron energy deposition cross section (in MeV/cm);
\rho is the material density (in g/cm^3).
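Once discretized over energy groups, Equation 8 is a weighted sum of the angle-integrated electron fluence. A minimal Python sketch for a single voxel, with hypothetical group data (not the paper's cross sections):

```python
import numpy as np

# Hypothetical data for one voxel, three energy groups, angle-integrated.
sigma_ED = np.array([0.030, 0.025, 0.020])  # energy-deposition cross sections (MeV/cm)
rho      = 1.0                              # material density (g/cm^3), water-like
psi_e    = np.array([2.0, 1.5, 0.5])        # electron fluence per group (arbitrary units)

# Discrete analogue of Eq. 8: D_i = sum_g (sigma_ED[g] / rho) * psi_e[g]
dose = np.sum(sigma_ED / rho * psi_e)       # MeV/g in these units
```

The same sum runs over every voxel of the output grid; only the fluence and material data change.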
The iteration scheme used in solving the equation is shown in figure 1 below.
Experimental determination of radiation beam profiles and PDDs
A pre-calibrated Elekta Precise clinical linear accelerator was used to collect the beam data (profiles and PDDs). The profile and PDD data were collected following the guideline recommended by the CMS XiO beam modelling guide [14]. The diagonal profile scans were collected at an SSD of 100 cm with the largest open field size of 40 x 40 cm², at depths of 0.5, 1.0, 2.0, 3.0, 5.0, 10.0, 20.0, and 30.0 cm, up to the deepest obtainable depth. Scans were generated at a depth increment of 3 mm. The open field profiles were collected for square field sizes of 5 x 5 and 30 x 30 cm² at depths of dmax, 5.0, 10.0, 20.0, and 30.0 cm. Scans were made in the in-plane direction for a fixed collimator, at a depth increment of 2 mm. Wedge-aligned profile scans were collected in the wedge direction for square field sizes of 10 x 10 and 20 x 20 cm² at depths of dmax, 5.0, 10.0, and 20.0 cm. The open field PDDs were measured at field sizes of 3 x 3 and 30 x 30 cm², at a depth increment of 1 mm up to the deepest obtainable depth of 35 cm. The wedge field PDDs were measured at field sizes of 5 x 5, 10 x 10 and 20 x 20 cm², also at a depth increment of 1 mm up to the deepest obtainable depth. Once all scans were acquired for both the 6 and 18 MeV photon beams, they were compared with those computed using the algorithm above. The experimental set-up for the measurements is shown in figure 2 below.
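A measured-versus-calculated comparison of this kind can be automated. A sketch that interpolates the calculated PDD onto the measured depth grid and reports the maximum local deviation (the curves below are synthetic, not the paper's measurements):

```python
import numpy as np

def max_pdd_deviation(depth_m, pdd_m, depth_c, pdd_c):
    """Interpolate the calculated PDD onto the measured depths and return
    the maximum absolute difference, in percentage points."""
    pdd_c_interp = np.interp(depth_m, depth_c, pdd_c)
    return float(np.max(np.abs(pdd_m - pdd_c_interp)))

# Illustrative exponential fall-off curves past dmax, sampled at 1 mm steps
depth      = np.arange(1.5, 35.0, 0.1)
measured   = 100.0 * np.exp(-0.0500 * (depth - 1.5))
calculated = 100.0 * np.exp(-0.0505 * (depth - 1.5))  # slightly steeper fall-off

dev = max_pdd_deviation(depth, measured, depth, calculated)
```

The same helper can be pointed at profile scans by swapping depth for off-axis distance.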
Fig. 1: The iteration scheme used to solve the equations is shown in the algorithm below
% File: Linearized Boltzmann Equations
% Date: 12th of March 2012
% Author: Michael Akpochafor
% The code performs a time-independent single calculation at high resolution:
%   D(r) = int (mu/rho) * psi_p(r') * A(r - r') d^3r'
% where
%   D(r)      = dose at a point
%   mu/rho    = mass attenuation coefficient
%   psi_p(r') = primary photon energy fluence
%   A(r - r') = convolution kernel, the distribution of fractional energy
%               imparted per unit volume
%   TERMA at depth includes the energy retained by the photon.
% THIS PROGRAMME SOLVES EQUATION (8):
%   D_i = int_0^infty dE int_4pi dOmega (sigma_ED^e(r,E)/rho(r)) * psi^e(r,E,Omega)
l = input('length ');            % spatial extent of the grid
stepx = input('dx = ');          % spatial step
stept = input('dt = ');          % iteration step
maxt = 10;                       % maximum iteration time
x = 0:stepx:l;
t = 0:stept:maxt;
u = zeros(length(x), length(t));
r = stept/(stepx*stepx);         % ratio of the explicit scheme (stable for r <= 1/2)
u(1,:) = input('boundary value ')*ones(size(t));
for j = 1:length(t)-1
    for i = 2:length(x)-1
        % explicit finite-difference update of the iteration scheme
        u(i,j+1) = r*u(i-1,j) + (1-2*r)*u(i,j) + r*u(i+1,j);
    end
end
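The explicit finite-difference update in the code above is stable only when r = dt/dx² ≤ 1/2. A non-interactive Python equivalent of the same loop, with the stability condition checked explicitly (grid parameters chosen for illustration):

```python
import numpy as np

dx, dt, maxt, length = 0.1, 0.004, 1.0, 1.0
x = np.arange(0.0, length + dx, dx)
t = np.arange(0.0, maxt + dt, dt)
r = dt / dx**2                      # stability ratio of the explicit scheme
assert r <= 0.5, "explicit update is unstable for r > 1/2"

u = np.zeros((len(x), len(t)))
u[0, :] = 1.0                       # fixed boundary value, as in the MATLAB code
for j in range(len(t) - 1):
    # same update: u[i,j+1] = r*u[i-1,j] + (1-2r)*u[i,j] + r*u[i+1,j]
    u[1:-1, j+1] = r*u[:-2, j] + (1 - 2*r)*u[1:-1, j] + r*u[2:, j]
```

Because each new value is a convex combination of its three neighbours when r ≤ 1/2, the solution stays bounded by the boundary value.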
Fig. 2: Experimental set-up for acquisition of beam profile and PDD scans (LINAC gantry; reference ionization chamber; field ionization chamber at 5 cm depth).
III. Results
Scanned data for the 6 MeV photon beam
Below are the results of the measured versus calculated PDDs and profiles. The coloured lines represent the calculated PDDs and profiles, while the black lines represent the measured ones.
Fig. 3.1a: 6 MeV PDD for 3 x 3 cm² field size. (Axes: PDD (%) versus depth (cm).)
Fig. 3.1b: 6 MeV PDD for 8 x 8 cm² field size. (Axes: PDD (%) versus depth (cm).)
Fig. 3.2a: 6 MeV wedge PDD for 3 x 3 cm² field size. (Axes: PDD (%) versus depth (cm).)
Fig. 3.2b: 6 MeV wedge PDD for 3 x 3 cm² field size. (Axes: PDD (%) versus depth (cm).)
Fig. 3.3a: In-plane profile for 5 x 5 cm² field size. (Axes: OCR (%) versus distance from CAX (cm); legend: HB, Meas.)
Fig. 3.3b: In-plane profile for 30 x 30 cm² field size. (Axes: OCR (%) versus distance from CAX (cm); legend: HB, Meas.)
Fig. 3.4a: Wedge profile for 3 x 3 cm² field size. (Axes: OCR (%) versus distance from CAX (cm); legend: HB, Meas.)
Fig. 3.4b: Wedge profile for 20 x 20 cm² field size. (Axes: OCR (%) versus distance from CAX (cm); legend: HB, Meas.)
Fig. 3.5a: Cross-plane profiles for 3 x 3 cm² field size. (Axes: OCR (%) versus distance from CAX (cm); legend: HB, Meas.)
Fig. 3.5b: Cross-plane profiles for 10 x 10 cm² field size. (Axes: OCR (%) versus distance from CAX (cm); legend: HB, Meas.)
Scanned data for the 18 MeV photon beam
Fig. 3.6a: 18 MeV PDD for 3 x 3 cm² field size. (Axes: PDD (%) versus depth (cm).)
Fig. 3.6b: 18 MeV PDD for 15 x 15 cm² field size. (Axes: PDD (%) versus depth (cm).)
Fig. 3.7a: 18 MeV PDD for 3 x 3 cm² field size, showing the effect of electron contamination (region of electron contamination marked). (Axes: PDD (%) versus depth (cm).)
Fig. 3.7b: 18 MeV PDD for 12 x 12 cm² field size, showing the effect of electron contamination (region of electron contamination marked). (Axes: PDD (%) versus depth (cm).)
Fig. 3.8a: 18 MeV cross-plane profile for 5 x 5 cm² field size. (Axes: OCR (%) versus distance from CAX (cm); legend: HB, Meas.)
Fig. 3.8b: 18 MeV cross-plane profile for 30 x 30 cm² field size. (Axes: OCR (%) versus distance from CAX (cm); legend: HB, Meas.)
Fig. 3.9a: 18 MeV in-plane profile for 5 x 5 cm² field size. (Axes: OCR (%) versus distance from CAX (cm); legend: HB, Meas.)
Fig. 3.9b: 18 MeV in-plane profile for 30 x 30 cm² field size. (Axes: OCR (%) versus distance from CAX (cm).)
IV. Discussion And Conclusion
The results of the measured normal PDDs, determined at a reference depth of 10 cm for different field sizes, against the calculated PDDs for the 6 MeV photon beam are presented in figs. 3.1(a) and (b). The calculated PDDs completely overlap the measured PDDs for the small field size, as observed in fig. 3.1(a), while there is a shift in the PDD tail for the large field size, as observed in fig. 3.1(b). However, the shift is negligible. For the wedge PDDs, the shift between the measured and calculated PDDs occurs at the Dmax region and increases with increasing field size, as observed in figs. 3.2(a) and (b). This may be due to the inability of the algorithm to model the fluence calculation for wedges [7-8]. The results of the normal PDDs for the 18 MeV photon beam, presented in figs. 3.6(a) and (b), follow a similar pattern to those of the 6 MeV photon beam: the calculated PDDs completely overlap the measured PDDs. Electron contamination has been shown to increase with larger field sizes and higher photon energies [10]; this is evident in the results for the 18 MeV photon beam presented in figs. 3.7(a) and (b). The electron contamination in the smaller field size (3 x 3 cm²) PDD in fig. 3.7(a) is much smaller than that of the 12 x 12 cm² PDD. This is because electron contamination is mostly caused by the components (i.e. flattening filter, collimators, monitor chamber, etc.) in the head of the LINAC; when the collimator opening is decreased (i.e. a small field size), the electron contamination also decreases, as part of the electron source is shielded by the collimator blocks.
The variation of dose along a line perpendicular to the central beam axis at a certain depth is known as the beam profile. It represents how the dose is altered at points away from the central beam axis. There is also a slight shift between the calculated and measured in-plane profiles, as shown in figs. 3.3(a) and (b). The calculated wedge profiles for the 6 MeV photon beam have a slight shift at the shoulder, as observed in figs. 3.4(a) and (b),
compared to the measured ones, and this decreases with increasing field size, unlike the PDDs. The cross-plane profiles also follow a similar pattern to the in-plane ones, as shown in figs. 3.5(a) and (b) for the 6 MeV photon beam. The 18 MeV photon beam cross-plane and in-plane profiles, presented in figs. 3.8(a) and (b) and 3.9(a) and (b), also follow a similar pattern to those of the 6 MeV photon beam. However, larger deviations between the calculated and measured profiles are observed for the largest field size (30 x 30 cm²); this is of less concern since most clinical field sizes are smaller. Generally, there is an improvement in the tail region of all the calculated profiles, the region that determines the penumbra of the beam. The penumbra is the region of rapid dose fall-off located at the edge of a beam; it is usually considered to be the part of the dose distribution that lies between 20 and 80 % of the central axis dose. The slight shift between the calculated and measured PDDs and profiles is negligible.
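The 20-80 % penumbra width can be read off a profile numerically. A sketch that locates the 20 % and 80 % points on one field edge by linear interpolation (the profile below is synthetic, for illustration only):

```python
import numpy as np

def penumbra_width(x, ocr, low=20.0, high=80.0):
    """Distance between the 20% and 80% points on one (rising) field edge.
    ocr must be monotonically increasing across the edge for np.interp."""
    x_low  = np.interp(low,  ocr, x)
    x_high = np.interp(high, ocr, x)
    return abs(x_high - x_low)

# Synthetic rising edge: OCR climbs linearly from 0 to 100 % over 1 cm
x   = np.linspace(-1.0, 0.0, 101)    # cm, from outside the field toward centre
ocr = np.linspace(0.0, 100.0, 101)   # percent of central-axis dose

w = penumbra_width(x, ocr)
```

For this linear edge the 20 % and 80 % points sit 0.6 cm apart; on real scan data the same routine is applied to the measured OCR samples around each field edge.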
Generally, there is good agreement between the measured beam data and those calculated using the algorithm, as shown in the results. This algorithm can be implemented as an in-house algorithm for modelling photon beam data and also as an independent quality assurance tool for checking the accuracy of clinical TPS algorithms with regard to beam data modelling during commissioning and annual QA checks.
References
[1]. Podgorsak EB. Radiation Oncology Physics: A handbook for Teachers and Students. Vienna: IAEA publication. 2005.
[2]. Van Dyk J, Barnett RB, Cygler JE, Shragge PC. “Commissioning and quality assurance of treatment planning computers.” Int. J.
Radiat. Oncol. Biol. Phys. 1993; 26:261–273.
[3]. Van Dyk J. “Quality Assurance.” In Treatment Planning in Radiation Oncology. Khan FM, Potish RA (Eds.). (Baltimore, MD:
Williams and Wilkins). 1997;123–146.
[4]. Shaw JE. (Ed.) “A Guide to Commissioning and Quality Control of Treatment Planning Systems.” The Institution of Physics and
Engineering in Medicine and Biology. 1994.
[5]. Fraass BA, Doppke K, Hunt M, Kutcher G, Starkschall G, Stern R, Van Dyk J. “American Association of Physicists in Medicine
Radiation Therapy Committee Task Group 53: Quality Assurance for Clinical Radiotherapy Treatment Planning.” Med. Phys.
1998; 25:1773–1829.
[6]. Fraass BA. “Quality Assurance for 3-D Treatment Planning.” In Teletherapy: Present and Future. Palta J, Mackie TR (Eds.).
Madison: Advanced Medical Publishing. 1996;253–318.
[7]. Mackie TR, Scrimger JW, Battista JJ. A convolution method of calculating dose for 15 MV x-rays. Med Phys. 1985;12:188–96.
[8]. Boyer AL, Zhu Y, Wang L, Francois P. "Fast Fourier transform convolution calculations of x-ray isodose distributions in inhomogeneous media." Med. Phys. 1989;16:248–253.
[9]. Lewis EE, Miller WF. 1984. "Computational Methods of Neutron Transport." New York: Wiley.
[10]. Sjogren R, Karlsson M. 1996. Electron contamination in clinical high energy photon beams. Med. Phys. 23:1873–81.
[11]. Wareing TA, McGhee JM, Morel JE, Pautz SD. 2001. Discontinuous Finite Element Sn Methods on Three-Dimensional
Unstructured Grids. Nucl. Sci. Engr., 138:2.
[12]. Wareing TA, Morel JE, McGhee JM. 2000. Coupled Electron-Photon Transport Methods on 3-D Unstructured Grids. Trans. Am. Nucl. Soc. 83.
[13]. Siebers JV, Keall PJ, Nahum AE, Mohan R. 2000. Converting absorbed dose to medium to absorbed dose to water for Monte
Carlo based photon beam dose calculations. Phys. Med. Biol. 45:983-995.
[14]. CMS XiO Beam Modelling Guide. 2008. XiO version 4.62 treatment planning system. Stockholm: Elekta software publication.
[15]. Vassiliev ON, Wareing TA, McGhee J, Failla G. 2010. Validation of a new grid-based Boltzmann equation solver for dose calculation in radiotherapy with photon beams. Phys. Med. Biol. 55:581–598.