The central theme of this article is that the usual Shannon entropy [1] is not sufficient to address the unknown Gaussian population average, and a remedy is necessary. A refined version is introduced and named nucleus entropy in this article. Statistical properties and advantages of the Gaussian nucleus entropy are derived and used to interpret 2005 and 2007 waste disposals (in 1,000 tons) by fifty-one states (including the District of Columbia) in the USA. Each state generates its own waste, imports waste from other states for revenue, and exports waste to other states with a payment [2]. Nucleus entropy is large when the population average is large and/or when the population variance is smaller. Nucleus entropy demonstrates the significance of waste policies under four scenarios: (1) keep only generated waste, (2) keep generated waste with receiving in and shipping out, (3) without receiving in, and (4) without shipping out. In the end, a few recommendations are suggested for waste management policy makers.
Application of stochastic lognormal diffusion model with... (Alexander Decker)
This document describes a stochastic lognormal diffusion model that incorporates polynomial exogenous factors to model energy consumption in Ghana from 1999-2010. It presents:
1) A lognormal diffusion process model with drift and diffusion coefficients that depend on exogenous factors affecting consumption.
2) Maximum likelihood estimators for the drift and diffusion coefficients based on energy consumption data.
3) Hypothesis tests to evaluate the effect of exogenous factors on consumption patterns, showing Ghana's energy consumption has an upward trend over time.
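The drift-and-diffusion setup in items 1)–3) can be sketched in its simplest, constant-coefficient form; the paper's model lets the coefficients depend on polynomial exogenous factors, which are omitted here, and the parameter values below are purely illustrative:

```python
import math
import random

def simulate_lognormal_diffusion(x0, mu, sigma, dt, n_steps, seed=0):
    """Simulate X(t) with dX = X (mu dt + sigma dW) using the exact
    lognormal update X_{k+1} = X_k * exp((mu - sigma^2/2) dt + sigma sqrt(dt) Z),
    so every sampled value stays strictly positive."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n_steps):
        z = rng.gauss(0.0, 1.0)
        step = (mu - 0.5 * sigma**2) * dt + sigma * math.sqrt(dt) * z
        path.append(path[-1] * math.exp(step))
    return path

# Twelve annual steps from an illustrative starting consumption level of 100.
path = simulate_lognormal_diffusion(100.0, 0.05, 0.1, 1.0, 12)
```

A positive estimated drift in such a model is what produces the upward trend in consumption reported in item 3).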
1) The document presents a mathematical model for analyzing lung function measurements in asthmatic children under different stresses like histamine challenge, exercise challenge, and on a control day.
2) The inverse Gaussian distribution and its hazard rate function are utilized to model the critical maximum points of the lung function curves.
3) The results from applying the mathematical model show that the failure rate curve has an upside down bathtub shape, with the maximum hazard rate highest for exercise challenges and lowest for the control day.
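The inverse Gaussian hazard rate named in 2) can be computed directly from the standard pdf and cdf; the sketch below (with illustrative parameters mu = lambda = 1, not the paper's fitted values) reproduces the upside-down-bathtub shape, with the hazard rising to a single maximum and then decaying:

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ig_pdf(x, mu, lam):
    """Inverse Gaussian density f(x; mu, lambda) for x > 0."""
    return math.sqrt(lam / (2.0 * math.pi * x**3)) * \
        math.exp(-lam * (x - mu)**2 / (2.0 * mu**2 * x))

def ig_cdf(x, mu, lam):
    """Standard closed form of the inverse Gaussian CDF."""
    a = math.sqrt(lam / x)
    return phi(a * (x / mu - 1.0)) + \
        math.exp(2.0 * lam / mu) * phi(-a * (x / mu + 1.0))

def ig_hazard(x, mu, lam):
    """Hazard rate h(x) = f(x) / (1 - F(x)); unimodal (upside-down bathtub)."""
    return ig_pdf(x, mu, lam) / (1.0 - ig_cdf(x, mu, lam))
```

Evaluating `ig_hazard` on a grid shows it is low near zero, peaks once, and then declines, matching the failure-rate shape reported in 3).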
The document summarizes the creation of a new student portal at Imperial Valley College. It was created to meet new federal data collection requirements and address limitations of the existing system. The portal was built using the free and open-source Joomla content management system to encapsulate existing features and allow additional new features to be added inexpensively. Since launching in July, over 200 users have accessed the portal and initial usage data suggests it will help the college meet federal requirements while providing added value to students, faculty and staff.
This document discusses noise, information theory, and entropy as they relate to communication systems. It introduces key concepts such as additive noise channels, random processes like white Gaussian noise, signal-to-noise ratio, analog vs. digital signals, information theory measures like entropy and channel capacity, and coding techniques including Huffman coding. Huffman coding is presented as an optimal variable-length coding scheme that assigns codes based on symbol probabilities to achieve an average code length close to the entropy limit.
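The Huffman construction described above, repeatedly merging the two least probable symbols, can be sketched in a few lines; the dyadic probabilities below are an assumed toy alphabet for which the average code length exactly meets the entropy limit:

```python
import heapq

def huffman_codes(freqs):
    """Build a Huffman code from {symbol: probability}.

    Each heap entry is (weight, tiebreak, partial_codes); merging two
    entries prepends '0' to one side's codes and '1' to the other's.
    The integer tiebreak keeps heapq from ever comparing dicts."""
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

# Dyadic probabilities: entropy = 1.75 bits, and Huffman achieves it exactly.
codes = huffman_codes({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
```

For this alphabet the code lengths are 1, 2, 3, 3 bits, giving an average length of 1.75 bits per symbol, equal to the source entropy.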
This document provides an introduction to information theory, including definitions of key concepts like entropy, marginal entropy, joint entropy, and conditional entropy. It uses examples like coin flips to illustrate these concepts. Entropy quantifies the average amount of information or uncertainty in a random variable, while joint entropy looks at multiple variables together. Conditional entropy considers the information remaining in one variable after another is observed. The document outlines these topics to introduce fundamental information theory concepts.
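The relationship between these quantities, the chain rule H(X,Y) = H(X) + H(Y|X), can be checked on a coin-flip example of the kind the document uses (the joint distribution below, two independent fair flips, is an assumed toy case):

```python
import math
from collections import defaultdict

def entropy(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Joint distribution of two independent fair coin flips.
joint = {("H", "H"): 0.25, ("H", "T"): 0.25,
         ("T", "H"): 0.25, ("T", "T"): 0.25}

# Marginal of the first flip.
px = defaultdict(float)
for (x, y), p in joint.items():
    px[x] += p

H_X = entropy(px.values())          # marginal entropy: 1 bit
H_XY = entropy(joint.values())      # joint entropy: 2 bits
H_Y_given_X = H_XY - H_X            # conditional entropy via the chain rule: 1 bit
```

Because the flips are independent, conditioning on the first flip removes nothing: H(Y|X) equals the full 1 bit of uncertainty in Y.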
The student portal provides access to administrative tasks, academic records, financial information, and other resources for students. Students can log in using their student ID and password. The portal home page has announcements and links to other areas. Students should ensure they are viewing the correct academic term. Administrative functions include viewing transcripts, checking grades, paying tuition, and updating contact information. Academic resources include class schedules and tracking degree progress through the degree audit system. Financial features allow viewing bills and accepting/declining financial aid. Assistance is available by contacting an advisor or the helpdesk if any issues arise.
The document describes a student portal created using ASP.NET and C# for a college. The portal aims to provide students a single interface to access various services like viewing results, sending messages, managing schedules and tasks, downloading documents, and searching for faculty. It also allows administrators to add, update, and manage student information and results online. The portal was developed using tools like Visual Studio .NET, SQL Server and utilizes IIS for the web server. Future enhancements may include online fee payments, event listings, discussion forums and email integration.
This document provides an overview of information theory and coding. It begins with an introduction and table of contents. The main topics covered include:
1. Probability theory and random variables
2. Random processes
3. Elements of information theory
4. Source encoding
5. Error control coding for digital communication systems
6. Error detection and correction
7. Field algebra
8. Linear block codes
9. Cyclic codes
10. BCH codes
Each chapter provides explanations of key concepts and mathematical models within each topic area. Examples and practice problems are also included to help readers understand the concepts. The overall document serves as a comprehensive textbook on information theory and coding.
Capture recapture estimation for elusive events with two lists (Alexander Decker)
The document presents a new capture-recapture model for estimating the size of elusive epidemiologic events using two lists or data sources. It compares three estimators - the proposed estimator N, the traditional Petersen estimator Ns, and another estimator Nq. Both the Akaike Information Criterion and Mean Absolute Deviation, based on simulation studies, showed that the proposed estimator N performs better by being more consistent and accurate compared to the other estimators, especially when recaptures/relistings are low. For elusive populations where recaptures tend to be low, the Petersen estimator breaks down. The proposed model uses a "coverage probability" that accounts for heterogeneity in capture probabilities across individuals.
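The proposed estimator N is not specified in this summary, but the classical Petersen baseline it is compared against is standard and easy to sketch; Chapman's bias-corrected variant is included here purely for illustration, since it stays finite in exactly the zero-recapture case where Petersen breaks down:

```python
def petersen_estimate(n1, n2, m):
    """Classical Petersen (Lincoln) estimator: N-hat = n1 * n2 / m,
    where n1 and n2 are the two list sizes and m is the number of
    individuals appearing on both lists. Undefined when m == 0,
    which is why it breaks down for elusive populations."""
    if m == 0:
        raise ValueError("no recaptures: Petersen estimate is undefined")
    return n1 * n2 / m

def chapman_estimate(n1, n2, m):
    """Chapman's bias-corrected variant, finite even when m == 0."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1
```

With two lists of 100 and 10 overlaps, Petersen gives 1,000; with zero overlaps it fails outright, while Chapman still returns a (very large) finite estimate.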
This document proposes a modified Mann-Whitney U test that adjusts for tied observations between two sampled populations. The modification develops a test statistic (W) that intrinsically accounts for ties in its expected value and variance. W is calculated based on the ranks assigned to observations from each population when pooled and ranked together. The null hypothesis is that the expected value of W is equal to 0, indicating the populations have equal medians. This can be tested using a chi-square distribution with the test statistic comparing W to its variance, which is also adjusted for ties. The method allows analysis when populations are on an ordinal scale and avoids problems with ties encountered in traditional methods.
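The paper's W statistic and its tie-adjusted variance are not reproduced in this summary; the sketch below shows only the standard midrank treatment of ties that any such adjustment builds on, pooling both samples, assigning average ranks to tied observations, and forming the usual U statistic from the rank sum:

```python
def midranks(values):
    """Assign 1-based average (mid) ranks to a pooled sample,
    so tied observations share the mean of their rank positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j + 2) / 2  # mean of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def mann_whitney_u(sample1, sample2):
    """U statistic for sample1 vs sample2 from pooled midranks:
    U = R1 - n1(n1+1)/2, where R1 is sample1's rank sum."""
    pooled = list(sample1) + list(sample2)
    ranks = midranks(pooled)
    r1 = sum(ranks[: len(sample1)])
    return r1 - len(sample1) * (len(sample1) + 1) / 2
```

Under no shift, U sits near n1*n2/2; identical tied samples like `[1, 2]` vs `[1, 2]` land exactly on that midpoint because the midranks split every tie evenly.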
Talk given at Kobayashi-Maskawa Institute, Nagoya University, Japan (Peter Coles)
Cosmic Anomalies
Observational measurements of the temperature variation of the Cosmic Microwave Background across the celestial sphere made by the Wilkinson Microwave Anisotropy Probe (WMAP) and, more recently, Planck have played a major part in establishing the standard "concordance" cosmological model. However, extensive statistical analyses of these data have also revealed some tantalising anomalies whose interpretation within the standard framework is by no means clear. In this talk, I'll discuss the significance of the evidence for some aspects of this anomalous behaviour, offer some possible theoretical models, and suggest how future measurements may provide firmer conclusions.
Computing Bayesian posterior with empirical likelihood in population genetics (Pierre Pudlo)
This document discusses using empirical likelihood to approximate Bayesian posteriors for population genetics models with intractable likelihoods. It introduces empirical likelihood, which defines parameters in terms of moment constraints and profiles a likelihood function without specifying a full parametric model. For population genetics models, pairwise composite likelihoods based on allelic states can be used, defining moment constraints in terms of pairwise maximum likelihood estimates. Numerical examples show empirical likelihood can provide better approximations than incorrectly specified parametric models when the true model is complex.
The document discusses using information theory techniques to infer causal relationships between variables from time series data, even when the data is incomplete, noisy, and irregularly sampled. It presents a method called information transfer (IT) that can quantify the relative strength and directionality of coupling between variables. The method is demonstrated on simulated data and two geological case studies: 1) analyzing the influence of northern and southern hemisphere climate change on East Asian monsoon intensity over the last glacial cycle, and 2) characterizing relationships among four Phanerozoic seawater isotope records. The document concludes that while IT between proxies cannot prove direct causality, it shows promise as a data-driven approach to infer causal interactions from geological records.
SEMI-PARAMETRIC ESTIMATION OF Px,y({(x,y)/x >y}) FOR THE POWER FUNCTION DISTR... (IJESM JOURNAL)
In the context of reliability, the stress-strength model describes the life of a component that has a random strength X and is subjected to a random stress Y. The component functions satisfactorily whenever X > Y and fails the instant the applied stress exceeds the strength, so R = P(Y < X) is a measure of component reliability. In this paper, we obtain semi-parametric estimators of the reliability under the stress-strength model for the power function distribution, under both complete and censored samples, and illustrate the performance of the estimators using a simulation study.
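Under the standard power function distribution F(x) = x^c on (0, 1) (an assumed parameterization; the paper's estimators are not reproduced here), R = P(Y < X) has the closed form c1/(c1 + c2), which a quick Monte Carlo check confirms:

```python
import random

def r_closed_form(c1, c2):
    """R = P(Y < X) for independent power-function variables with
    shapes c1 (strength) and c2 (stress): integral of x^c2 * c1*x^(c1-1)
    over (0,1) gives c1 / (c1 + c2)."""
    return c1 / (c1 + c2)

def r_monte_carlo(c1, c2, n=200_000, seed=1):
    """Estimate R by simulation, sampling via the inverse CDF U**(1/c)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = rng.random() ** (1.0 / c1)
        y = rng.random() ** (1.0 / c2)
        hits += y < x
    return hits / n
```

With strength shape 2 against stress shape 1, reliability is 2/3, and the simulated proportion matches it to within sampling error.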
The document discusses various methods for constructing confidence intervals for estimating multinomial proportions. It aims to analyze the propensity for aberrations (i.e. unrealistic bounds like negative values) in the interval estimates across different classical and Bayesian methods. Specifically, it provides the mathematical conditions under which each method may produce aberrant interval limits, such as zero-width intervals or bounds exceeding 0 and 1, especially for small sample counts. The document also develops an R program to facilitate computational implementation of the various methods for applied analysis of multinomial data.
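The kind of aberration described, a lower bound below zero for a small cell count, is easy to exhibit with the classical Wald interval for a single multinomial cell proportion (the specific methods and R program the document analyzes are not reproduced here):

```python
import math

def wald_interval(count, n, z=1.96):
    """Classical Wald interval p-hat +/- z * sqrt(p-hat(1-p-hat)/n) for one
    multinomial cell; it is not clipped to [0, 1], so small counts can
    produce aberrant (negative) lower bounds."""
    p = count / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

# One observation out of 50: the lower limit goes negative.
lo, hi = wald_interval(1, 50)
```

Here p-hat = 0.02 but the half-width is about 0.039, so the interval dips below zero, exactly the unrealistic-bound behaviour the document characterizes mathematically.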
NEW SCENARIO FOR TRANSITION TO SLOW 3-D TURBULENCE PART I. SLOW 1-D TURBULENCE... (ijrap)
An analytical, non-perturbative study of the three-dimensional nonlinear stochastic partial differential equation with additive thermal noise, analogous to that proposed by V. N. Nikolaevskii [1]-[5] to describe longitudinal seismic waves, is presented. The equation has a threshold of short-wave instability and symmetry, providing long-wave dynamics. A new mechanism for generating quantum chaos in nonlinear dynamical systems with an infinite number of degrees of freedom is proposed. It is hypothesized that physical turbulence could be identified with quantum chaos of the considered type. It is shown that the additive thermal noise dramatically destabilizes the ground state of the Nikolaevskii system, causing it to make a direct transition from a spatially uniform to a turbulent state.
NEW SCENARIO FOR TRANSITION TO SLOW 3-D TURBULENCE PART I. SLOW 1-D TURBULENCE... (ijrap)
This document analyzes the 3D and 1D stochastic Nikolaevskii systems, which model turbulence. It introduces the 3D and 1D stochastic partial differential equations that were obtained by adding small white noise to the original non-stochastic Nikolaevskii systems. The key findings are: 1) Numerical integration of the 1D system is actually solving the stochastic model due to inherent computational noise. 2) Even small noise can significantly impact turbulent modes and change system behavior dramatically. 3) Developed turbulence in physical systems may be characterized as quantum chaos driven by thermal fluctuations based on modeling of the stochastic systems.
This document summarizes research on tachyon inflation in the context of Dirac-Born-Infeld (DBI) and Randall-Sundrum type II (RSII) models. It introduces tachyon fields which arise in string theory, describes how they can drive inflation in various scenarios, and presents numerical solutions of the equations of motion. Key aspects covered include tachyon potentials, equations for tachyon inflation, extending the analysis to an anti-de Sitter braneworld model based on RSII, and calculating observational parameters like the scalar spectral index and tensor-to-scalar ratio from the solutions.
1) The document discusses using quantum probes to indirectly extract information about complex quantum systems like ultracold atomic gases, without directly measuring the system.
2) One method is to use an impurity atom as a qubit probe immersed in a 2D Bose-Einstein condensate. Interactions between the probe and gas induce decoherence on the probe that depends on properties of the gas like dimensionality and phase fluctuations, allowing characterization of the gas.
3) The non-Markovianity of the probe's dynamics, quantified by information flow between the probe and gas, can reveal information about the gas without directly measuring it. Positive information flow indicates non-Markovian dynamics and backflow of information
Optimization of Manufacturing of Circuits XOR to Decrease Their Dimensions (ijrap)
We analyze the possibility of increasing the density of elements in integrated circuits, illustrated by the example of manufacturing an XOR circuit. In the framework of this paper we consider a heterostructure consisting of a substrate and an epitaxial layer with a specific configuration. Several required areas of the heterostructure are doped by diffusion and/or ion implantation. We then consider optimized annealing of the dopant and/or radiation defects.
PICO presentation at EGU 2014 about the use of measures from information theory to visualise uncertainty in kinematic structural models - and to estimate where additional data would help reduce uncertainties. Some nice counter-intuitive results ;-)
On New Root Finding Algorithms for Solving Nonlinear Transcendental Equations (AI Publications)
In this paper, we present new iterative algorithms to find a root of a given nonlinear transcendental equation. In the proposed algorithms, we use nonlinear Taylor polynomial interpolation and a modified error-correction term with a fixed-point concept. We also investigate possible extensions of the higher-order iterative algorithms from a single variable to higher dimensions. Several numerical examples are presented to illustrate the proposed algorithms.
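The paper's own algorithms are not given in this summary; for contrast, the classical Newton iteration such methods aim to improve on can be sketched for a typical transcendental equation, here x = cos x (an illustrative choice):

```python
import math

def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Classical Newton iteration x_{k+1} = x_k - f(x_k)/f'(x_k),
    stopping when the step size falls below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    return x

# Solve the transcendental equation x - cos(x) = 0.
root = newton(lambda x: x - math.cos(x), lambda x: 1 + math.sin(x), 1.0)
```

From x0 = 1 the iteration converges quadratically to the Dottie number, approximately 0.739085; higher-order schemes of the kind the paper proposes aim to cut the iteration count further.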
Partitions and entropies intel third draft (Anna Movsheva)
The document discusses statistical properties of the entropy function of a random partition. It introduces the concept of counting the number of partitions of a set X that have entropy less than or equal to some value x. This counting function is denoted Θ(p, x). The document hypothesizes that the normalized counting function θ(p, x) = Θ(p, x)/Θ(H(p, X)) can be approximated by a cumulative Gaussian distribution, with the mean and standard deviation of the distribution being functions of the probability distribution p. Evidence for this conjecture is provided by computer simulations.
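The entropy of a partition used in this counting argument is the Shannon entropy of the block-size proportions; a minimal helper makes that concrete (the counting function Θ and the Gaussian fit themselves are not reproduced here):

```python
import math

def partition_entropy(blocks):
    """Shannon entropy (bits) of a partition of a finite set X into
    blocks, where each block B contributes probability p = |B| / |X|."""
    n = sum(len(b) for b in blocks)
    return -sum((len(b) / n) * math.log2(len(b) / n) for b in blocks)
```

Two equal halves of a 4-element set give 1 bit; the trivial one-block partition gives 0, the extremes between which the counting function Θ(p, x) accumulates partitions by entropy level.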
This document is a physics problem set from MIT's 8.044 Statistical Physics I course in Spring 2004. It contains 5 problems related to statistical physics and probability distributions. Problem 1 considers the probability distribution and properties of the position of a particle undergoing simple harmonic motion. Problem 2 examines the probability distribution of the x-component of angular momentum for a quantum mechanical system. Problem 3 analyzes a mixed probability distribution describing the energy of an electron. Problem 4 involves finding and sketching the time-dependent probability distribution for the position of a particle given its wavefunction. Problem 5 concerns Bose-Einstein statistics and calculating properties of the distribution that describes the number of photons in a given mode.
This document summarizes key probability distributions: binomial, Poisson, and normal. The binomial distribution describes the number of successes in fixed number of trials where the probability of success is constant. The Poisson distribution approximates the binomial when the number of trials is large and the probability of success is small. The normal distribution describes many continuous random variables and is symmetric with two parameters: mean and standard deviation. The document also discusses when binomial and Poisson distributions can be approximated as normal distributions.
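The Poisson-approximates-binomial claim is easy to check numerically: for large n and small p the Binomial(n, p) pmf is close to Poisson(np) (the parameter values below are illustrative, chosen so np = 2):

```python
import math

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Large n, small p: the two pmfs should agree closely term by term.
n, p = 1000, 0.002
diff = max(abs(binom_pmf(k, n, p) - poisson_pmf(k, n * p)) for k in range(20))
```

By Le Cam's theorem the total variation distance here is at most n*p^2 = 0.004, so every individual probability agrees to well under a percent.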
Knowledge of cause-effect relationships is central to the field of climate science, supporting mechanistic understanding, observational sampling strategies, experimental design, model development and model prediction. While the major causal connections in our planet's climate system are already known, there is still potential for new discoveries in some areas. The purpose of this talk is to make this community familiar with a variety of available tools to discover potential cause-effect relationships from observed or simulation data. Some of these tools are already in use in climate science, others are just emerging in recent years. None of them are miracle solutions, but many can provide important pieces of information to climate scientists. An important way to use such methods is to generate cause-effect hypotheses that climate experts can then study further. In this talk we will (1) introduce key concepts important for causal analysis; (2) discuss some methods based on the concepts of Granger causality and Pearl causality; (3) point out some strengths and limitations of these approaches; and (4) illustrate such methods using a few real-world examples from climate science.
Lossless image compression using new biorthogonal wavelets (sipij)
Even though a large number of wavelets exist, new wavelets are needed for specific applications. One of the basic wavelet categories is orthogonal wavelets, but wavelets that are both orthogonal and symmetric are hard to find, and symmetry is required for perfect reconstruction. The solution comes in the form of biorthogonal wavelets, which preserve the perfect reconstruction condition. Although a number of biorthogonal wavelets have been proposed in the literature, this paper proposes four new biorthogonal wavelets that give better compression performance. The new wavelets are compared with traditional wavelets using the design metrics Peak Signal to Noise Ratio (PSNR) and Compression Ratio (CR). The Set Partitioning in Hierarchical Trees (SPIHT) coding algorithm was used to perform the image compression.
ON ANALYTICAL APPROACH FOR ANALYSIS OF DISSOLUTION OF A MEDICINAL PRODUCT IN ... (pbij)
In this paper we introduce a model of the dissolution of a medicinal product in an organism, taking into account changing conditions. The model makes it possible to estimate the spatio-temporal distribution of the concentration of the medicinal product during dissolution. We also consider an analytical approach to analyzing this dissolution, as well as possibilities to accelerate or decelerate it.
Partitions and entropies intel third draftAnna Movsheva
The document discusses statistical properties of the entropy function of a random partition. It introduces the concept of counting the number of partitions of a set X that have entropy less than or equal to some value x. This counting function is denoted Θ(p, x). The document hypothesizes that the normalized counting function θ(p, x) = Θ(p, x)/Θ(H(p, X)) can be approximated by a cumulative Gaussian distribution, with the mean and standard deviation of the distribution being functions of the probability distribution p. Evidence for this conjecture is provided by computer simulations.
This document is a physics problem set from MIT's 8.044 Statistical Physics I course in Spring 2004. It contains 5 problems related to statistical physics and probability distributions. Problem 1 considers the probability distribution and properties of the position of a particle undergoing simple harmonic motion. Problem 2 examines the probability distribution of the x-component of angular momentum for a quantum mechanical system. Problem 3 analyzes a mixed probability distribution describing the energy of an electron. Problem 4 involves finding and sketching the time-dependent probability distribution for the position of a particle given its wavefunction. Problem 5 concerns Bose-Einstein statistics and calculating properties of the distribution that describes the number of photons in a given mode.
This document summarizes key probability distributions: binomial, Poisson, and normal. The binomial distribution describes the number of successes in fixed number of trials where the probability of success is constant. The Poisson distribution approximates the binomial when the number of trials is large and the probability of success is small. The normal distribution describes many continuous random variables and is symmetric with two parameters: mean and standard deviation. The document also discusses when binomial and Poisson distributions can be approximated as normal distributions.
Knowledge of cause-effect relationships is central to the field of climate science, supporting mechanistic understanding, observational sampling strategies, experimental design, model development and model prediction. While the major causal connections in our planet's climate system are already known, there is still potential for new discoveries in some areas. The purpose of this talk is to make this community familiar with a variety of available tools to discover potential cause-effect relationships from observed or simulation data. Some of these tools are already in use in climate science, others are just emerging in recent years. None of them are miracle solutions, but many can provide important pieces of information to climate scientists. An important way to use such methods is to generate cause-effect hypotheses that climate experts can then study further. In this talk we will (1) introduce key concepts important for causal analysis; (2) discuss some methods based on the concepts of Granger causality and Pearl causality; (3) point out some strengths and limitations of these approaches; and (4) illustrate such methods using a few real-world examples from climate science.
Lossless image compression using new biorthogonal waveletssipij
Even though a large number of wavelets exist, one needs new wavelets for their specific applications. One
of the basic wavelet categories is orthogonal wavelets. But it was hard to find orthogonal and symmetric
wavelets. Symmetricity is required for perfect reconstruction. Hence, a need for orthogonal and symmetric
arises. The solution was in the form of biorthogonal wavelets which preserves perfect reconstruction
condition. Though a number of biorthogonal wavelets are proposed in the literature, in this paper four new
biorthogonal wavelets are proposed which gives better compression performance. The new wavelets are
compared with traditional wavelets by using the design metrics Peak Signal to Noise Ratio (PSNR) and
Compression Ratio (CR). Set Partitioning in Hierarchical Trees (SPIHT) coding algorithm was utilized to
incorporate compression of images.
ON ANALYTICAL APPROACH FOR ANALYSIS OF DISSOLUTION OF A MEDICINAL PRODUCT IN ...pbij
In this paper we introduce a model of dissolution of a medicinal product in a organism with account of
changing of conditions. The model gives a possibility to estimate spatio-temporal distribution of
concentration of a medicinal product during dissolution. We also consider an analytical approach to
analyze the above dissolution. We consider a possibility to accelerate and decelerate of the above
dissolution.
Similar to Entropy Nucleus a nd Use i n Waste Disposal Policies (20)
A review on techniques and modelling methodologies used for checking electrom...nooriasukmaningtyas
The proper function of the integrated circuit (IC) in an inhibiting electromagnetic environment has always been a serious concern throughout the decades of revolution in the world of electronics, from disjunct devices to today’s integrated circuit technology, where billions of transistors are combined on a single chip. The automotive industry and smart vehicles in particular, are confronting design issues such as being prone to electromagnetic interference (EMI). Electronic control devices calculate incorrect outputs because of EMI and sensors give misleading values which can prove fatal in case of automotives. In this paper, the authors have non exhaustively tried to review research work concerned with the investigation of EMI in ICs and prediction of this EMI using various modelling methodologies and measurement setups.
Advanced control scheme of doubly fed induction generator for wind turbine us...IJECEIAES
This paper describes a speed control device for generating electrical energy on an electricity network based on the doubly fed induction generator (DFIG) used for wind power conversion systems. At first, a double-fed induction generator model was constructed. A control law is formulated to govern the flow of energy between the stator of a DFIG and the energy network using three types of controllers: proportional integral (PI), sliding mode controller (SMC) and second order sliding mode controller (SOSMC). Their different results in terms of power reference tracking, reaction to unexpected speed fluctuations, sensitivity to perturbations, and resilience against machine parameter alterations are compared. MATLAB/Simulink was used to conduct the simulations for the preceding study. Multiple simulations have shown very satisfying results, and the investigations demonstrate the efficacy and power-enhancing capabilities of the suggested control system.
CHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECTjpsjournal1
The rivalry between prominent international actors for dominance over Central Asia's hydrocarbon
reserves and the ancient silk trade route, along with China's diplomatic endeavours in the area, has been
referred to as the "New Great Game." This research centres on the power struggle, considering
geopolitical, geostrategic, and geoeconomic variables. Topics including trade, political hegemony, oil
politics, and conventional and nontraditional security are all explored and explained by the researcher.
Using Mackinder's Heartland, Spykman Rimland, and Hegemonic Stability theories, examines China's role
in Central Asia. This study adheres to the empirical epistemological method and has taken care of
objectivity. This study analyze primary and secondary research documents critically to elaborate role of
china’s geo economic outreach in central Asian countries and its future prospect. China is thriving in trade,
pipeline politics, and winning states, according to this study, thanks to important instruments like the
Shanghai Cooperation Organisation and the Belt and Road Economic Initiative. According to this study,
China is seeing significant success in commerce, pipeline politics, and gaining influence on other
governments. This success may be attributed to the effective utilisation of key tools such as the Shanghai
Cooperation Organisation and the Belt and Road Economic Initiative.
Literature Review Basics and Understanding Reference Management.pptxDr Ramhari Poudyal
Three-day training on academic research focuses on analytical tools at United Technical College, supported by the University Grant Commission, Nepal. 24-26 May 2024
bank management system in java and mysql report1.pdf
Entropy Nucleus a nd Use i n Waste Disposal Policies
International Journal on Information Theory (IJIT), Vol.4, No.2, April 2015
DOI : 10.5121/ijit.2015.4201
Entropy Nucleus and Use in Waste Disposal
Policies
Ramalingam Shanmugam1
1
School of Health Administration, Texas State University, San Marcos, TX78666, USA
ABSTRACT
The central theme of this article is that the usual Shannon's entropy [1] is not sufficient to address the
unknown Gaussian population average. A remedy is necessary. By peeling away entropy junkies, a refined
version is introduced, and it is named nucleus entropy in this article. Statistical properties and advantages
of the Gaussian nucleus entropy are derived and utilized to interpret the 2005 and 2007 waste disposals (in
1,000 tons) by the fifty-one states (including the District of Columbia) in the USA. Each state generates its own
waste, imports waste from other states for revenue, and exports waste to other states with a payment [2]. Nucleus
entropy is large when the population average is large and/or when the population variance is smaller.
Nucleus entropy advocates the significance of the waste policies under four scenarios: (1) keep only
generated, (2) keep generated with receiving in and shipping out, (3) without receiving in, and (4) without
shipping out. In the end, a few recommendations are suggested for the waste management policy makers.
KEYWORDS
Shannon’s entropy, p-value, statistical power, hypothesis testing
1. MOTIVATION TO REFINE SHANNON’S ENTROPY
What is entropy? In essence, entropy, or its equivalent nomenclature information, is a
knowledge basis, which helps to change one's opinion. Fisher [3] initiated the very thought of
information in statistical contexts. In 1948, Shannon promoted a logarithmic measure of
information to quantify signals in communication disciplines. Shannon named his seminal idea to
measure information entropy. Why did he do so? Entropy is an elusive but useful concept.
An entropy should be quantifiable, partially orderable, additive, storable, and transmittable.
Claude Shannon himself mentioned: "My greatest concern was what to call it. I thought of calling
it information, but the word was overly used, so I decided to call it uncertainty. When I discussed
it with John von Neumann, he had a better idea. Von Neumann told me: you should call it entropy, for two
reasons. In the first place, your uncertainty function has been used in statistical mechanics under
that name, so it already has a name. In the second place, and more important, nobody knows what
entropy really is, so in a debate you will always have the advantage" (Tribus and McIrvine [4]).

Does entropy refer to uncertainty or (dis)order? Shannon interpreted data information as a positive
entropy. If so, it creates conflicts. Greater information ought to imply smaller entropy. The
information might be lost in the process of transmission while the entropy might increase. This
conflicting thought of entropy originated in quantum physics (Jaynes [5]). However, the entropy
concept is utilized in economics and statistics, among other disciplines, with a contextual
interpretation opposite to what Shannon intended (Lippman and McCall [6]). The entropy of a
continuous random variable (RV) may be negative, and that of a discrete RV may even be infinite.
Such controversies lead some to give up on entropy, as stated in Ben-Naim [7].
Shannon's entropy possesses several useful properties but not the much-needed additive property,
which is a requirement in data analysis. When an additional observation becomes available, the
expected entropy ought to increase. Shannon's entropy does not do so and hence needs a
modernization. This article modernizes Shannon's entropy by peeling away the unnecessary entropy
junkies in it and names the new version nucleus entropy. In particular, the properties of the
Gaussian nucleus entropy are derived and illustrated using the log-transformation of the
generated, shipped, and received waste disposals (in 1,000 tons) among the fifty-one (including
Washington, D.C.) states in the USA in [2].
2. NUCLEUS ENTROPY DEFINED WITH PROPERTIES
What is a Gaussian population? A brilliant French mathematician by the name of Abraham de
Moivre [8] wrote a self-published seven-page description of a bell-shaped curve. Only 71 years
later, the German astronomer Johann Carl Friedrich Gauss [9] utilized de Moivre's idea to model
the errors between the actual and projected positions of the celestial bodies of our universe. As a
misnomer, the bell-shaped curve is recognized as the Gaussian, not de Moivre's, population frequency
curve once a random sample is drawn from it.
Before discussing further, consider the Gaussian population frequency curve

f(y; \mu, \sigma^2) = \frac{e^{-(y-\mu)^2/2\sigma^2}}{\sqrt{2\pi\sigma^2}}; \quad -\infty < y < \infty; \; \sigma^2 > 0, \quad (1)

where \mu is an unknown natural parameter and \sigma^2 is a known shape parameter. Shannon's
entropy H(\cdot) is then

H_{Gauss}(y; \mu, \sigma^2) = -\int f(y; \mu, \sigma^2)\,\ln f(y; \mu, \sigma^2)\,dy = \frac{1}{2}\ln 2\pi e \sigma^2. \quad (2)
Notice that the natural parameter \mu is not even a part of the entropy. Furthermore, Shannon's
entropy echoes a conflict. To see it, suppose that a random sample y_1, y_2, \ldots, y_n is drawn from a
Gaussian population f(y; \mu, \sigma^2). It is known (Mood, Graybill, and Boes [10]) that the sum
s = y_1 + y_2 + \cdots + y_n of n independent and identically normally distributed outcomes follows the
normal probability structure f(s; n\mu, n\sigma^2). In that case, the Shannon entropy of the sum
s ought to be n times H_{Gauss}(y; \mu, \sigma^2). But that does not happen. The
Shannon entropy of the sum is not the sum of the individual entropies. That is,

H_{Gauss}(s; n\mu, n\sigma^2) = \frac{1}{2}\ln 2\pi e n \sigma^2 \neq n H_{Gauss}(y; \mu, \sigma^2).

For entropy practitioners, this causes confusion, as Shannon's entropy does not add up as a new
Gaussian observation becomes available. The existence of such a deficiency in Shannon's entropy
is a sufficient reason to modernize the entropy idea in an alternative way, and that is exactly what
is done in this article.
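The non-additivity above is easy to check numerically. The following sketch (using only the standard library) evaluates equation (2) for a single Gaussian observation and for the sum of n iid observations, confirming that the sum's entropy grows only by (1/2) ln n rather than by a factor of n; the variance value is an arbitrary illustrative choice.

```python
import math

def shannon_entropy_gauss(var):
    """Differential entropy (in nats) of a Gaussian with variance `var`, per eq. (2)."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

n, var = 10, 4.0
h_one = shannon_entropy_gauss(var)       # entropy of a single observation
h_sum = shannon_entropy_gauss(n * var)   # entropy of the sum of n iid observations

# The sum's entropy exceeds h_one only by (1/2) ln n, far short of n * h_one:
assert abs(h_sum - (h_one + 0.5 * math.log(n))) < 1e-12
assert abs(h_sum - n * h_one) > 1.0
```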
In other words, this article introduces a new and novel approach based on the Gaussian nucleus
entropy. That is,

Definition 1. A nucleus entropy, \kappa_Y^{\mu, \sigma^2}, resides in the Gaussian population frequency curve once it
is written as

f(y; \mu, \sigma^2) = A(\mu, \sigma^2)\, B(y, \sigma^2)\, \kappa_Y^{\mu, \sigma^2},

with an observation y, a natural parameter \mu, and an entropy accumulator parameter \sigma^2.
In other words, the Gaussian nucleus entropy is

\kappa_{Gauss:Y}^{\mu, \sigma^2} = e^{\mu y / \sigma^2}.
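The factorization in Definition 1 can be verified directly: with A(\mu, \sigma^2) = e^{-\mu^2/2\sigma^2}/\sqrt{2\pi\sigma^2} and B(y, \sigma^2) = e^{-y^2/2\sigma^2}, the product A \cdot B \cdot e^{\mu y/\sigma^2} recovers the Gaussian density (1). A minimal numerical check, with hypothetical parameter and observation values:

```python
import math

mu, var, y = 2.0, 1.5, 0.7  # hypothetical natural parameter, accumulator, observation

A = math.exp(-mu**2 / (2 * var)) / math.sqrt(2 * math.pi * var)  # A(mu, sigma^2): free of y
B = math.exp(-y**2 / (2 * var))                                  # B(y, sigma^2): free of mu
nucleus = math.exp(mu * y / var)                                 # e^{mu*y/sigma^2}

# The three factors multiply back to the Gaussian pdf of eq. (1):
pdf = math.exp(-(y - mu)**2 / (2 * var)) / math.sqrt(2 * math.pi * var)
assert abs(A * B * nucleus - pdf) < 1e-12
```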
The function A(\mu, \sigma^2) is insulated from the observation y. The function B(y, \sigma^2) does not connect
to the unknown natural parameter \mu. In a sense, both functions (that is,
A(\mu, \sigma^2) and B(y, \sigma^2)) are entropy junkies. Without losing any generality, the nucleus entropy
could be expressed in a logarithmic scale just for the sake of comparison with Shannon's
entropy, which is in a logarithmic scale. Notice that the nucleus entropy involves both the
unknown natural parameter \mu and the entropy accumulator parameter \sigma^2. The Gaussian nucleus
entropy is more appropriate, appealing, and meaningful than Shannon's entropy. The expected
nucleus entropy is

S_{Gauss:Y}^{\mu, \sigma^2} = E_f\{\ln(\kappa_{Gauss:Y}^{\mu, \sigma^2})\},

which simplifies to

S_{Gauss:Y}^{\mu, \sigma^2} = E_f\left\{\frac{\mu Y}{\sigma^2}\right\} = \frac{\mu^2}{\sigma^2}. \quad (3)

The sample counterpart of (3) is named the observable Gaussian nucleus entropy and it
is

O_{Gauss}^{\bar y, n} = \frac{\bar y^2}{\hat\sigma_y^2}.

The nucleus entropy happens to be the squared inverse of the coefficient of
variation (CV); see Mood et al. [10] for the definition of the CV. The entropy increases with an increase
of the average and/or with a decrease of the variance. The sample mean \bar y is the maximum
likelihood estimator (MLE) of the unknown natural parameter \mu. The MLE is invariant (Mood et
al. [10]). That is, the MLE of a function of the parameter is simply the function of the MLE of the
parameter.
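The identity between the observable nucleus entropy and the squared inverse CV can be sketched as follows; the sample values are hypothetical placeholders for log-waste data.

```python
import statistics

sample = [3.1, 2.8, 3.6, 3.0, 2.9, 3.4, 3.2, 2.7]  # hypothetical log-waste observations
ybar = statistics.mean(sample)
s2 = statistics.pvariance(sample)  # plug-in (MLE) variance estimate

O = ybar**2 / s2               # observable Gaussian nucleus entropy
cv = (s2 ** 0.5) / ybar        # sample coefficient of variation
assert abs(O - 1 / cv**2) < 1e-9   # O is exactly the squared inverse CV
```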
Practitioners would wonder: is an observed Gaussian nucleus entropy statistically significant? An
answer depends on the outcome of testing the null hypothesis H_0: S_{Gauss:Y}^{\mu, \sigma^2} = 0 against the
alternative hypothesis H_1: S_{Gauss:Y}^{\mu, \sigma^2} > 0. For this purpose, we proceed as follows. The statistic
O_{Gauss}^{\bar y, n} asymptotically follows a normal distribution with expected
value E(O_{Gauss}^{\bar y, n}) = S_{Gauss:Y}^{\mu, \sigma^2} and variance
var(O_{Gauss}^{\bar y, n}) = [\partial_{\bar y} O_{Gauss}^{\bar y, n}]^2\, \hat\sigma^2 / n, where the
notations \partial_{\bar y} and \sigma^2 denote respectively the derivative with respect to \bar y evaluated at \mu and
var(Y). Note that var(\bar y) = \hat\sigma^2 / n and \partial_{\bar y} O_{Gauss}^{\bar y, n} = 2\bar y / \hat\sigma^2. Hence, the score to test
the null hypothesis H_0: S_{Gauss:Y}^{\mu, \sigma^2} = 0 is

Z = \frac{\sqrt{n}\,\bar y}{2\hat\sigma}.

Hence, the null hypothesis is rejected in favor of
the alternative hypothesis H_1: S_{Gauss:Y}^{\mu, \sigma^2} > 0 with the p-value

p\text{-value} = \Pr\left(Z \ge \frac{\sqrt{n}\,\bar y}{2\hat\sigma}\right), \quad (4)

where Z is the standardized Gaussian random variable.
The (statistical) power of accepting a given true specific alternative value
S_{Gauss:Y}^{\mu, \sigma^2} = S_{Gauss:Y,1}^{\mu_1, \sigma^2} is

power = \Pr\left(Z \ge z_{\alpha/2} - \frac{\sqrt{n}\,\mu_1}{2\hat\sigma}\right), \quad (5)

where 0 < \alpha < 1 is a chosen significance level.
The above results are illustrated in the next section using the logarithm of waste disposals (in 1,000
tons) generated, shipped out, and received in the fifty-one (including Washington, D.C.) states of the
USA in [2].
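The testing procedure of equations (4) and (5), as reconstructed above, can be sketched in a few lines; the function name and the sample values are hypothetical, and the power formula assumes the z_{\alpha/2} cutoff of (5).

```python
import math
from statistics import NormalDist

def nucleus_entropy_test(sample, alpha=0.05, mu1=None):
    """Score test of H0: S = 0 vs H1: S > 0 for the Gaussian nucleus entropy.

    Implements the reconstructed formulas: Z = sqrt(n)*ybar/(2*sigma_hat) with
    p = Pr(Z >= z_obs); power (5) uses a specified alternative mean mu1.
    """
    n = len(sample)
    ybar = sum(sample) / n
    sigma = math.sqrt(sum((y - ybar) ** 2 for y in sample) / n)  # MLE of sigma
    z_obs = math.sqrt(n) * ybar / (2 * sigma)
    p_value = 1 - NormalDist().cdf(z_obs)                        # eq. (4)
    power = None
    if mu1 is not None:
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
        power = 1 - NormalDist().cdf(z_alpha - math.sqrt(n) * mu1 / (2 * sigma))  # eq. (5)
    return z_obs, p_value, power

# Usage on hypothetical log-waste data, with a hypothetical national average mu1:
z, p, pw = nucleus_entropy_test([3.1, 2.8, 3.6, 3.0, 2.9, 3.4, 3.2, 2.7], mu1=3.0)
```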
2.1. Illustration with Waste Disposals in Fifty-One US States
Archeologists (William and Murphy [11]) show that even in 6,500 B.C., the North American
communities generated as much as 5.3 pounds of waste per day. In the current modern age of the 21st
century, with a high quality of life (which requires a large amount of material consumption), the
waste disposals are extremely large and have become a challenge to environmentalists. Health
perils persist when the waste disposals are not properly processed. Otherwise, the trash might
explode and/or contaminate the land, air, and water sources. Sanitation is ruined, and that
causes known and unknown viruses and, later on, illnesses. Improperly managed waste disposals
(including health-hazardous medical wastes) are known to spread infections and cause chronic
diseases (Reinhardt and Gordon [12]). WHO reports that in 2002 alone, about 25.9% of all 14.7
million deaths worldwide were due to infection and could have been averted with a hygienic
living environment. A lesson is that wastes must be properly collected, managed, and disposed of
to maintain a hygienic living environment. Otherwise, residents and visitors might undergo a
health risk. Often, killer methane gas is released from improperly maintained waste-filled land
sites. Many health hazards, like fire due to ignitability, bad smell due to corrosiveness,
radioactivity in surrounding water sources, toxicity, etc., exist in such sites. Remedial actions
include recycling, neutralizing, incineration, destruction, and conversion to energy. Effective
5 May 1992, 172 countries worldwide started implementing the Basel Convention's
agreement to practice waste management using technologies [13]. In the USA, the fifty-one
(including Washington, D.C.) states practice shipping out and receiving in addition to their own
generated waste. This article investigates the policy scenarios (with respect to shipping out and/or
receiving in waste) using significant changes in the Gaussian nucleus entropies.
Figure 1. As it is. Figure 2. No receiving. Figure 3. No shipping. Figure 4. Only generated.
To be specific, the generated, Y_g, shipped out, Y_s, and received in, Y_r, waste disposals by the US
states as displayed in [2] for the years 2005 and 2007 are considered and analyzed. The amounts
\ln(Y_g + Y_r - Y_s), \ln(Y_g - Y_s), \ln(Y_g + Y_r), and \ln(Y_g) respectively represent the waste under the
current policy with shipping out and receiving in, under a policy of cancelling receiving in, under a
policy of stopping shipping out, and under a policy of doing without both shipping out and
receiving in. The amounts follow a Gaussian frequency curve (because the dots are close to the diagonal
line in the P-P plots in Figures 1 through 4) with averages \mu_{grs}, \mu_{gs}, \mu_{gr}, and \mu_g and variances
\sigma^2_{grs}, \sigma^2_{gs}, \sigma^2_{gr}, and \sigma^2_g.
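The four policy amounts can be sketched as follows; the disposal figures are hypothetical placeholders, not values from [2].

```python
import math

# Hypothetical state-level disposals (in 1,000 tons): generated, shipped out, received in
Yg, Ys, Yr = 5200.0, 310.0, 480.0

scenarios = {
    "current (generate + receive - ship)": math.log(Yg + Yr - Ys),  # ln(Yg + Yr - Ys)
    "no receiving":                        math.log(Yg - Ys),       # ln(Yg - Ys)
    "no shipping":                         math.log(Yg + Yr),       # ln(Yg + Yr)
    "generated only":                      math.log(Yg),            # ln(Yg)
}
```

Each of the four log-amounts would then be tested for a significant nucleus entropy across the fifty-one states.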
The observed Gaussian nucleus entropy O_{Gauss}^{\bar y, n} is calculated and displayed in Table 1 for the fifty-
one states along with their p-values, according to (4). When the p-value is small (0.05 or less),
the alternative H_1: S_{Gauss:Y}^{\mu, \sigma^2} > 0 is acceptable, meaning that the nucleus entropy about the natural
parameter \mu is significant. We notice the following. The generated waste is negligible only in the
states District of Columbia, Idaho, and South Dakota, according to the nucleus entropy. The
nucleus entropy of the current practice of receiving in and shipping out along with the generated
waste is significant in all fifty-one states, including District of Columbia, Idaho, and South Dakota,
validating the current practice. If receiving in waste is discontinued, the nucleus
entropy becomes negligible in the states Alabama, Arkansas, Arizona, California, District of
Columbia, Hawaii, Idaho, Kentucky, Maine, Maryland, Michigan, Minnesota, Missouri,
Montana, Nebraska, New Hampshire, New Jersey, North Carolina, Pennsylvania, Utah, Virginia,
Washington, Wisconsin, and Wyoming, meaning that these states could consider cancelling the
policy of receiving in waste. When the nucleus entropy remains negligible
under a cancellation of shipping out waste, those states may not need to ship out; such states are
Delaware, Hawaii, Maine, and Vermont.
How sound is the methodology based on the nucleus entropy? To answer this, the statistical power
is calculated using the national average for \mu_1 in H_1: S_{Gauss:Y}^{\mu, \sigma^2} > 0 under each of the four scenarios,
according to (5). The minimum and maximum of the statistical powers across all four scenarios
for each state are listed in Table 1. No minimum is smaller than 0.21 and the maximum is mostly
0.99, implying that the methodology is powerful enough.
3. COMMENTS AND RECOMMENDATIONS
Four scenarios to deal with the waste disposals are evaluated using nucleus entropies. They are:
(1) keep only the generated waste, (2) receiving in waste in addition to generated waste without
sending out any, (3) sending out a part of generated waste without receiving in additional waste,
and (4) the current policy of receiving in and sending out waste in addition to generated waste. A
large entropy is indicative of significant waste. One scenario does not fit all fifty-one states,
according to the nucleus entropies. In an overall sense, some states perform similarly as a cluster
(Figure 5).
Figure 5. Clusters of states in the current practice of waste disposals
ACKNOWLEDGEMENTS
The author thanks Texas State University for travel support to present this research work at the
International Conference on Recent Innovations in Engineering & Technology, 13-14 February
2015, Chinna Salem, Tamil Nadu, India.
REFERENCES
[1] C. E. Shannon (1948). A mathematical theory of communication. Bell Sys. Tech. J., 27, 379-423;
623-656.
[2] www.epa.gov/epawaste/inforesources/data/biennialreport/index.htm
[3] R. A. Fisher (1925). Theory of Statistical Estimation. Proceedings of the Cambridge Philosophical
Society 22 (5): 700–725.
[4] M. Tribus and E. C. McIrvine (1971). Energy and information, Scientific American 224:179–186.
[5] E. T. Jaynes (1957). "Information theory and statistical mechanics" (PDF). Physical Review 106 (4):
620–630.
[6] S. S. Lippman and J.J. McCall (2001). "Information, Economics of", International Encyclopedia of
the Social & Behavioral Sciences, pp. 7480–7486.
[7] A. Ben-Naim (2011). A Farewell to Entropy: Statistical Thermodynamics Based on Information.
Singapore: World Scientific Press.
[8] A. de Moivre (1738). The Doctrine of Chances. ISBN 0-8218-2103-2.
[9] C. F. Gauss (1809). Theoria motvs corporvm coelestivm in sectionibvs conicis Solem ambientivm
[Theory of the Motion of the Heavenly Bodies Moving about the Sun in Conic Sections] (in Latin).
English translation.
[10] A. M. Mood, F. A. Graybill and D. C. Boes (1974). Introduction to the Theory of Statistics, New
York: McGraw Hill Press.
[11] R. William and C. Murphy (1992). Rubbish! The Archaeology of Garbage, New York, NY, Harper
Collin Publishers.
[12] P. A. Reinhardt, and J. G. Gordon (1991). Infectious and medical waste management. Chelsea,
Michigan, Lewis Publishers.
[13] http://www.basel.int/Countries/StatusofRatifications/BanAmendment/tabid/1344/Default.aspx.
Table 1. Gauss nucleus entropy O_{Gauss}^{\bar y, n} under four policies
(each policy cell shows the nucleus entropy with its p-value in parentheses):

State | Waste generated | Waste with generating, receiving in, and sending out | Waste without receiving in | Waste without sending out | Power (min, max)
AL    | 149.33 (0.001)  | 784.6822 (0.001)  | 1.903 (0.14)  | 666.57 (0.001) | (0.76, 0.87)
AK    | 307.27 (0.001)  | 27.6477 (2E-04)   | 0.241 (0.4)   | 27.439 (0.001) | (0.99, 0.99)
AZ    | 36.354 (2E-05)  | 676.829 (0.001)   | 0.251 (0.88)  | 86.178 (0.001) | (0.94, 0.99)
AR    | 6157.2 (0.001)  | 4846.406 (0.001)  | 876.6 (0.001) | 1937.6 (0.001) | (0.74, 0.99)
CA    | 2017.1 (0.001)  | 194.1282 (0.001)  | 0.549 (0.98)  | 161.1 (0.001)  | (0.69, 0.99)
CO    | 119.95 (1E-14)  | 6389.492 (0.001)  | 3.16 (0.08)   | 4227.3 (0.001) | (0.94, 0.99)
Author
Dr. Ram Shanmugam is currently a professor in the School of Health Administration at Texas State
University, San Marcos, USA. He obtained a Ph.D. degree in applied statistics from the School of
Business Administration, Temple University, Philadelphia. He has published more than 120 research
articles in refereed national and international journals. He is an elected fellow of the prestigious
International Statistical Institute. He is the Editor-in-Chief of the international journals
Advances in Life Sciences and Health (ALSH) and Global Journal of Research and Review.