A multi-stage method for sunphotometer calibration is proposed by combining a comparison calibration between two different wavelengths with an aureole observation method for long-wavelength calibration. Its effectiveness in reducing the influence of molecular and aerosol extinction on calibration under unstable turbidity conditions is clarified. By comparing results calculated with the proposed method (PM) and the existing individual calibration method (ILM), it is found that the proposed method achieves better calibration accuracy. Namely, in a comparison between ILM and PM using the 0.87 um band as reference, the largest calibration errors by PM (0.0014 and 0.0428) are lower than those by ILM (0.011 and 0.0489) for sky radiances with no error and with -3 to +3% and -5 to +5% errors. Analysis of 15 days of observation data from a POM-1 Skyradiometer shows that the largest standard deviation of the calibration constants by PM is 0.02016, lower than that by ILM (0.03858).
Towards the identification of the primary particle nature by the radiodetecti... (Ahmed Ammar Rebai, PhD)
Radio signals from extensive air showers (EAS) studied by the CODALEMA experiment have been detected by means of the classic short fat antenna array working in slave trigger mode behind a particle scintillator array. It is shown that the radio shower wavefront is curved with respect to the plane-wavefront hypothesis. A new fitting model (the parabolic model) is therefore proposed to fit the radio-signal time-delay distributions on an event-by-event basis. This model takes into account this wavefront property and several shower geometry parameters, such as the existence of an apparent localised radio-emission source located at a distance Rc from the antenna array, and the radio shower core on the ground. Comparison of the outputs of this model with other reconstruction models used in the same experiment shows:
1) that the radio shower core is shifted from the particle shower core in a statistical analysis approach;
2) the capability of the radiodetection method to reconstruct the curvature radius with a statistical error of less than 50 g.cm−2.
Finally, a preliminary study of the primary particle nature has been performed, based on a comparison between the data and Xmax distributions from AIRES Monte Carlo simulations for the same set of events.
To contact the author use ahmed.rebai2@gmail.com
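The parabolic-model idea above can be sketched numerically. The snippet below is a minimal illustration with made-up numbers, not the CODALEMA reconstruction: for a point-like source at distance R, the arrival-time delay at lateral distance d from the core is approximately tau(d) = d^2/(2cR), so a degree-2 polynomial fit of delays versus distance recovers R.

```python
import numpy as np

# Illustrative parabolic-wavefront fit on synthetic antenna data.
# All values (source distance, antenna spread, timing jitter) are assumed.

C = 299_792_458.0            # speed of light, m/s
R_TRUE = 3000.0              # assumed source distance, m

rng = np.random.default_rng(1)
d = rng.uniform(0.0, 300.0, 30)          # antenna lateral distances, m
tau = d**2 / (2 * C * R_TRUE)            # parabolic time delays, s
tau += rng.normal(0.0, 1e-10, d.size)    # ~0.1 ns timing jitter

a2, a1, a0 = np.polyfit(d, tau, 2)       # tau ~ a2*d^2 + a1*d + a0
R_est = 1.0 / (2 * C * a2)
print(f"true R = {R_TRUE:.0f} m, fitted R = {R_est:.0f} m")
```

With realistic nanosecond-scale jitter the quadratic coefficient, and hence the source distance, is still recovered to within a few percent.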
Explanation of very simple methods for atmospheric corrections, with an example adapted from a paper by the Dept. of Thermodynamics, University of Valencia, Spain.
Retrieval & monitoring of atmospheric greenhouse gases (GHGs) through remot... (debasishagri)
Climate change is one of the most important global environmental challenges of this century, and greenhouse gases (GHGs) are its main driver. Although much research has already been done on the distribution, sources, and sinks of GHGs, large uncertainties remain. Currently, only a few satellite instruments in orbit are able to measure atmospheric GHGs. The High Resolution Infrared Radiation Sounder (HIRS), the Atmospheric InfraRed Sounder (AIRS), and the Infrared Atmospheric Sounding Interferometer (IASI) perform measurements in the thermal infrared (TIR) spectral region, but these have low sensitivity to the lower troposphere. In contrast, the sensitivity of instruments measuring reflected solar radiation in the near-infrared (NIR)/shortwave infrared (SWIR) spectral region is much more constant with height and peaks near the surface. At present, SCIAMACHY aboard ENVISAT (launched in 2002) and TANSO (Thermal And Near infrared Sensor for carbon Observation) aboard GOSAT (Greenhouse gases Observing SATellite, launched in 2009) are the only orbiting instruments measuring in the NIR region. Among the available algorithms, the WFM-DOAS algorithm (Weighting Function Modified Differential Optical Absorption Spectroscopy), developed at the University of Bremen for the retrieval of trace gases from SCIAMACHY (Buchwitz et al. 2005), is the most widely used. It is based on the differential detection of radiance in gaseous absorption channels relative to neighbouring atmospherically transparent spectral channels (not influenced by the gas) to retrieve the concentration of the desired gas. However, scattering by aerosol and/or cloud particles remains a major source of uncertainty for SCIAMACHY XCO2 retrievals (Houweling 2005, Schneising 2008). Recently, using a new merged fit window approach, scientists have achieved errors of less than 0.5 ppm in the estimation of CO2 in the presence of thin cirrus cloud (Reuter, Buchwitz et al. 2010). Schneising et al. (2007) retrieved three years of column-averaged CO2 dry air mole fraction from the SCIAMACHY instrument using the retrieval algorithm WFM-DOAS version 1.0, with a precision of about 2 ppm. In India, a study comparing the atmospheric methane concentration pattern from SCIAMACHY with vegetation dynamics from SPOT showed fairly good correlation of methane emission with rice cultivation (Goroshi et al.).
Multi-Resolution Analysis: MRA Based Bright Band Height Estimation with Preci... (Waqas Tariq)
A method is proposed for reconstructing cross sections of rainfall situations from precipitation radar data, based on wavelet Multi-Resolution Analysis (MRA), which allows a peak of the radar reflectivity to be extracted in order to detect the bright band height. It is found that the bright band height can be estimated using MRA with a Daubechies wavelet basis, and that the boundaries in the rainfall structure can be clearly extracted with MRA.
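The peak-extraction step can be illustrated on synthetic data. The sketch below uses a 2-level Haar approximation (Haar is the simplest Daubechies wavelet, db1) to smooth a noisy vertical reflectivity profile before taking the peak as the bright-band range bin; the profile shape and bin numbers are made up for the demo.

```python
import numpy as np

# Toy bright-band detection via a Haar (db1) multi-resolution approximation.

def haar_approximation(x, levels):
    """Keep only the level-`levels` Haar approximation of x (len = 2^k)."""
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        a = 0.5 * (a[0::2] + a[1::2])   # pairwise averages = scaling coeffs
    for _ in range(levels):
        a = np.repeat(a, 2)             # piecewise-constant reconstruction
    return a

rng = np.random.default_rng(0)
bins = np.arange(128)
profile = 5.0 * np.exp(-bins / 60.0)               # background reflectivity
profile += 10.0 * np.exp(-((bins - 40) / 5.0)**2)  # bright-band peak at bin 40
profile += rng.normal(0.0, 0.5, bins.size)         # measurement noise

smooth = haar_approximation(profile, levels=2)
bright_band_bin = int(np.argmax(smooth))
print("estimated bright-band bin:", bright_band_bin)
```

Smoothing first makes the argmax land on the broad bright-band bump rather than on an isolated noise spike.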
ALTERNATE METHOD TO COMBINE MONITORED AND PREDICTED CONCENTRATIONS IN DISPERS... (Sergio A. Guerra)
The advent of the short-term NAAQS has prompted us to reassess the level of conservatism commonly used in dispersion modeling analyses. One area of conservatism in NAAQS demonstrations relates to combining predicted (modeled) concentrations from AERMOD with observed (monitored) concentrations. Normally, some of the highest monitored observations are combined with AERMOD results, yielding a very conservative total concentration. For example, in the case of the 1-hour NO2 NAAQS, the chances of the 98th-percentile monitored concentration occurring at the same time as the meteorology that generates the 98th-percentile ambient concentration are extremely low. Therefore, assuming that both happen at the same time is overly conservative.
This presentation includes justification for the use of a reasonable background concentration to combine with the AERMOD-predicted concentration. This method, if accepted by regulatory agencies, can help facilities demonstrate compliance in dispersion modeling analyses by assuming a more reasonable background concentration while, at the same time, remaining protective of the NAAQS.
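The statistical point can be demonstrated with two synthetic hourly series (this is a toy illustration, not a regulatory procedure): when the peaks of the monitored and modeled series are not synchronized, the 98th percentile of the hour-by-hour sum is well below the sum of the two individual 98th percentiles.

```python
import numpy as np

# Synthetic monitored (background) and modeled hourly concentrations for
# one year; the gamma distributions and scales are arbitrary assumptions.

rng = np.random.default_rng(42)
hours = 8760
monitored = rng.gamma(shape=2.0, scale=10.0, size=hours)  # background, ug/m3
modeled = rng.gamma(shape=2.0, scale=5.0, size=hours)     # AERMOD-like, ug/m3

p98_sum = np.percentile(monitored + modeled, 98)          # paired in time
naive_sum = np.percentile(monitored, 98) + np.percentile(modeled, 98)

print(f"98th pct of hourly sum:   {p98_sum:.1f}")
print(f"sum of the two 98th pcts: {naive_sum:.1f}  (more conservative)")
```

The gap between the two numbers is exactly the conservatism the presentation argues can be relaxed by choosing a more representative background.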
Temporal trends of spatial correlation within the PM10 time series of the Air... (Florencia Parravicini)
We analyse the temporal variations observed within time series of variogram parameters (nugget, sill, and range) of daily air quality data (PM10) over a ten-year time frame.
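For readers unfamiliar with the three parameters, this is a minimal sketch (on a synthetic correlated series, not the study's PM10 data) of the empirical semivariogram they are fitted to: the nugget is roughly the value at the smallest lag, the sill is the plateau near the series variance, and the range is the lag at which the plateau is reached.

```python
import numpy as np

# Empirical semivariogram gamma(h) = 0.5 * mean((z[t+h] - z[t])^2)
# on a synthetic AR(1) series standing in for daily PM10.

def semivariogram(z, max_lag):
    z = np.asarray(z, dtype=float)
    return np.array([0.5 * np.mean((z[h:] - z[:-h])**2)
                     for h in range(1, max_lag + 1)])

rng = np.random.default_rng(3)
n, phi = 5000, 0.9
z = np.empty(n)
z[0] = 0.0
for t in range(1, n):                    # AR(1): correlation decays with lag
    z[t] = phi * z[t - 1] + rng.normal()

gamma = semivariogram(z, max_lag=30)
print("gamma(1) =", round(float(gamma[0]), 2),
      " gamma(30) =", round(float(gamma[-1]), 2))
```

For this AR(1) process the theoretical curve is gamma(h) = var(z) * (1 - phi^h), so the semivariance rises from near zero (small nugget) toward the variance (sill).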
Leadership Driven Innovation: The Role of the Engineer in Our Future (Waqas Tariq)
The development of new technologies, particularly those identified as advanced and/or disruptive, relies on two individual but highly interrelated competencies: leadership and innovation. These two are the basis for the successful development of most of the major technologies in production today. At their best they are also the genesis of most of the large commercial and industrial organizations currently operating in the global marketplace. More importantly, it is the state of health of these two competencies that determines the longevity and profitability of these organizations. This paper tracks a hypothetical progression from inception to integrated production for what can be idealized as the maturation process of a new company/technology. While it is intended to represent most any type of new technology, it is particularly suited to advanced technologies and to organizations dealing in products for accelerating markets, such as the aerospace and advanced communications arenas.
2014 Top 10 Financial Industry Trends in China (Kapronasia)
2013 will be remembered as an incredibly dynamic year for China’s financial services industry. From the increasing number of hedge funds in the market to the emergence and regulation of Bitcoin, industry observers, investors, participants and regulators have had their work cut out for them keeping up with the market.
Many regulations were implemented in 2013, but will really only start to show their impact in 2014. In both banking and capital markets, these changes will be far reaching in both the breadth and depth of how they affect markets.
To help clients and industry participants better understand what lies ahead for financial technology in China, Kapronasia is proud to announce our 4th annual top-10 financial technology trends webinar.
The webinar is complimentary and open to both clients and non-clients of Kapronasia. During the one-hour webinar we will cover the key banking and capital markets trends that we expect to shape the market in 2014. Some of these trends include:
The continuing development of hedge funds
The expansion and diversification of the Shanghai Free Trade Zone
The future of Bitcoin in an increasingly bleak regulatory environment
Interest rate reform and how changes will affect how banks need to address the market
Atmospheric Correction of Remote Sensing Data_RamaRao.pptx (ssusercd49c0)
Atmospheric correction of remote sensing data. This PPT describes the development of a region-sensitive atmospheric correction method for hyperspectral image processing.
Research on Space Target Recognition Algorithm Based on Empirical Mode Decomp... (Nooria Sukmaningtyas)
A space target recognition algorithm based on the time series of radar cross section (RCS) is proposed in this paper to solve the problem of space target recognition in active radar systems. In the algorithm, the empirical mode decomposition (EMD) method is applied for the first time to extract features from the RCS time series. The normalized instantaneous frequencies of the high-frequency intrinsic mode functions obtained by EMD are used as the feature values for recognition, and an effective target recognition criterion is established. The effectiveness and stability of the algorithm are verified with both simulated and real data. In addition, the algorithm reduces the estimation bias of RCS caused by inaccurate evaluation, and it is of great significance in promoting the target recognition ability of narrow-band radar in practice.
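One building block of this pipeline can be sketched in isolation: extracting a normalized instantaneous frequency from a signal via an FFT-based Hilbert transform. The EMD sifting itself is omitted here; the "IMF" is just a clean test tone at an assumed frequency, so the recovered value is known in advance.

```python
import numpy as np

# Normalized instantaneous frequency of a stand-in IMF via the analytic signal.

def analytic_signal(x):
    """Hilbert-transform analytic signal via the FFT (even-length x)."""
    n = len(x)
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = h[n // 2] = 1.0    # keep DC and Nyquist
    h[1:n // 2] = 2.0         # double positive freqs, drop negative ones
    return np.fft.ifft(spec * h)

fs = 100.0                                  # sampling rate, Hz (assumed)
t = np.arange(1000) / fs
imf = np.cos(2 * np.pi * 5.0 * t)           # stand-in IMF at 5 Hz

phase = np.unwrap(np.angle(analytic_signal(imf)))
inst_freq = np.diff(phase) * fs / (2 * np.pi)       # Hz
norm_freq = inst_freq / fs                          # normalized by fs

print("mean normalized frequency:",
      round(float(np.mean(norm_freq[50:-50])), 4))
```

The interior slice avoids edge effects of the FFT-based transform; for the 5 Hz tone sampled at 100 Hz the normalized frequency comes out at 0.05.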
Dosimetry was carried out for radiotherapy patients, with measurements performed using LiF thermoluminescent dosimeters (TLDs). Water-equivalent (effective) thicknesses and target dose were evaluated from transmission data, with attention to the accuracy of the ratio of measured to expected value for each quantity. The mean entrance dose ratio was estimated as 1.01 ± 0.07. The mean ratio of effective to contour depth was 1.00 ± 0.13, a wide distribution reflecting the influence of contour inaccuracies. The mean ratio of measured contour dose to prescription was 1.00 ± 0.07. The difference between patient depth and effective depth reflects target dose discrepancies. Graphical simulations were performed using Monte Carlo methods and are presented.
The objective is to analyze and propose a methodology for handling the attenuating effect of carbon dioxide (CO2) on the performance of ultrasonic flow meters in gas flaring applications. The methodology is based on experiments performed in a wind tunnel at a Reynolds number of about 10^4 and CO2 concentrations above 60%. The results indicate that the ultrasonic meter exhibited measurement reading failures, especially during stages of abrupt changes in gas concentration above 5%. It is also verified that bringing the ultrasonic transducers closer together tends to reduce such measurement failures.
Modelling Optical Waveguide Bends by the Method of Lines (TELKOMNIKA JOURNAL)
Rigorous analytical and semi-analytical method-of-lines approaches have been used to calculate the transverse-electric field attenuation coefficient of a guided mode as it travels through a waveguide bend. The two approaches were then compared to better understand how the attenuation behaves along single-curve waveguides with a constant radius of curvature. The Helmholtz equation in polar coordinates was transformed into a curvilinear coordinate system to simulate the waveguide bends using the method-of-lines analysis. Simple absorbing boundary conditions are used within the method of lines to demonstrate the evanescent-field nature of the guided mode as it travels through waveguide bend structures. The results show reasonable agreement between the two theoretical approaches.
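The core of the method of lines, discretizing the transverse direction so the wave equation becomes a matrix eigenproblem, can be sketched for the simpler straight-slab case (not the paper's bend; geometry and indices below are assumed): discretizing d²E/dx² + k₀²n(x)²E = β²E with finite differences, the largest eigenvalue gives the fundamental mode's propagation constant.

```python
import numpy as np

# Finite-difference eigenmode solve for a symmetric slab waveguide
# (method-of-lines-style transverse discretization; illustrative values).

wavelength = 1.55                 # um (assumed)
k0 = 2 * np.pi / wavelength
n_core, n_clad = 1.50, 1.45       # assumed indices
half_width = 1.0                  # core half-width, um

N = 400
x = np.linspace(-5.0, 5.0, N)     # transverse grid, um
dx = x[1] - x[0]
n = np.where(np.abs(x) <= half_width, n_core, n_clad)

# Tridiagonal operator: second derivative + k0^2 n(x)^2, Dirichlet ends
# (the field decays to ~0 far from the core).
main = -2.0 * np.ones(N) / dx**2 + (k0 * n)**2
off = np.ones(N - 1) / dx**2
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

beta2 = np.linalg.eigvalsh(H)[-1]           # largest eigenvalue = beta^2
n_eff = np.sqrt(beta2) / k0
print(f"fundamental-mode effective index: {n_eff:.4f}")
```

A guided mode must satisfy n_clad < n_eff < n_core, which makes the result easy to sanity-check.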
Use of mesoscale modeling to increase the reliability of wind resource assess... (Jean-Claude Meteodyn)
During the wind farm design phase, the wind direction distribution is crucial information for wind turbine layout optimization. However, in complex terrain, the wind rose at turbine hub height can be quite different from met mast measurements. The study shows that in complex terrain, mesoscale modeling provides a complement to met mast measurement: it makes it possible to better determine the turbine-specific wind rose and to reduce the uncertainty in wind resource assessment. Coupling mesoscale and CFD models makes it possible to produce high-resolution wind maps that take into account both mesoscale and microscale terrain effects.
Some possible interpretations from data of the CODALEMA experiment (Ahmed Ammar Rebai, PhD)
The purpose of the CODALEMA experiment, installed at the Nançay Radio Observatory (France), is to study the radio-detection of ultra-high energy cosmic rays in the energy range 10^{16}-10^{18} eV. Distributed over an area of 0.25 km^2, the original device uses, in coincidence, an array of particle detectors and an array of short antennas with a centralized acquisition. A new analysis of the radio energy observable obtained from this system is presented, taking into account the geomagnetic effect. Since 2011, a new array of radio-detectors, consisting of 60 stand-alone, self-triggered stations, is being deployed over an area of 1.5 km^2 around the initial configuration. This new development leads to specific constraints, discussed in terms of cosmic-ray recognition and of wavefront analysis.
This deals with the assessment of several parameterizations of longwave radiation. The parameters were calibrated using a calibration tool on Ameriflux data.
Study of Rotation Measurements with a Passive Resonant Gyroscope Based on Hol... (ijtsrd)
We present rotation measurements performed with a passive resonant optical fibre gyroscope. The fibre used to realize the resonant cavity was a Kagome hollow-core fibre. Measurements were performed in two configurations. In the first, the counter-propagating beams resonate on the same cavity mode; this leads to the observation of lock-in, which prevents measuring rotation rates below roughly 1°/s. In the second, the counter-propagating beams resonate on two different cavity modes; this yields rotation-rate measurements below 0.1°/s that are currently limited by drifts. We will describe at the conference possible sources of drift and the solutions we are considering to circumvent them. Dr. Vinay Kumar Chaudhary, "Study of Rotation Measurements with a Passive Resonant Gyroscope Based on Hollow Core Fiber", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3, Issue-5, August 2019. URL: https://www.ijtsrd.com/papers/ijtsrd26390.pdf Paper URL: https://www.ijtsrd.com/physics/other/26390/study-of-rotation-measurements-with-a-passive-resonant-gyroscope-based-on-hollow-core-fiber/dr-vinay-kumar-chaudhary
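The scale of the measured effect follows from the standard Sagnac relation (a generic back-of-the-envelope, not specific to the Kagome-fibre device): the counter-propagating resonance split is df = 4·A·Ω/(λ·P) for loop area A and perimeter P, which for a circular loop reduces to df = (D/λ)·Ω. Loop diameter and wavelength below are assumed values.

```python
import math

# Sagnac scale factor of a resonant optical gyro (illustrative numbers).

D = 0.10                  # loop diameter, m (assumed)
wavelength = 1.55e-6      # m (assumed)

A = math.pi * D**2 / 4.0                 # loop area
P = math.pi * D                          # loop perimeter
scale = 4.0 * A / (wavelength * P)       # Hz per (rad/s); equals D/lambda

omega_earth = 7.292e-5                   # Earth rotation rate, rad/s
print(f"scale factor: {scale:.0f} Hz/(rad/s)")
print(f"resonance split for Earth rotation: {scale * omega_earth:.2f} Hz")
```

For these numbers Earth's rotation alone produces a split of a few hertz, which illustrates why drift sources at that level matter for sub-0.1°/s measurements.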
The Use of Java Swing’s Components to Develop a Widget (Waqas Tariq)
A widget is a kind of application that provides a single service, such as a map, news feed, simple clock, battery-life indicator, etc. This kind of interactive software object has been developed to facilitate user interface (UI) design; a given UI function may be implemented using different widgets with the same function. In this article, we present the widget as a platform that is used in various applications, such as the desktop, web browsers, and mobile phones. We also describe a visual menu of Java Swing’s components that will be used to build a widget. It is assumed that the reader has successfully compiled and run a program that uses Swing components.
3D Human Hand Posture Reconstruction Using a Single 2D Image (Waqas Tariq)
Passive sensing of the 3D geometric posture of the human hand has been studied extensively over the past decade. However, these research efforts have been hampered by the computational complexity caused by inverse kinematics and 3D reconstruction. In this paper, we focus on 3D hand posture estimation from a single 2D image, with robotic applications in mind. We introduce a human hand model with 27 degrees of freedom (DOFs) and analyze some of its constraints to reduce the DOFs without any significant degradation of performance. A novel algorithm to estimate the 3D hand posture from eight 2D projected feature points is proposed. Experimental results using real images confirm that our algorithm gives good estimates of the 3D hand pose. Keywords: 3D hand posture estimation; model-based approach; gesture recognition; human-computer interface; machine vision.
Camera as Mouse and Keyboard for Handicap Person with Troubleshooting Ability... (Waqas Tariq)
Camera mice have been widely used to let handicapped people interact with computers. Most importantly, a camera mouse must be able to replace all the roles of a typical mouse and keyboard: it must provide all mouse click events and keyboard functions (including all shortcut keys) when used by a handicapped person. It must also allow users to troubleshoot by themselves, and it must eliminate neck fatigue when used over long periods. In this paper, we propose a camera mouse system with a timer as the left-click event and blinking as the right-click event. We also modify the original on-screen keyboard layout by adding two buttons (a “drag/drop” button for mouse drag-and-drop events and another button to call the task manager, for troubleshooting) and by changing the behavior of the CTRL, ALT, SHIFT, and CAPS LOCK keys in order to provide keyboard shortcuts. We also develop a recovery method that allows users to leave the camera's view and come back again, in order to eliminate neck fatigue. Experiments involving several users were carried out in our laboratory. The results show that our camera mouse allows users to type, perform left- and right-click events, drag and drop, and troubleshoot without using their hands. With this system, handicapped people can use a computer more comfortably and with reduced eye dryness.
A Proposed Web Accessibility Framework for the Arab Disabled (Waqas Tariq)
The Web is providing unprecedented access to information and interaction for people with disabilities. This paper presents a Web accessibility framework that offers ease of Web access for disabled Arab users and facilitates their lifelong learning as well. The proposed framework provides the disabled Arab user with an easy means of access using their mother language, so they do not have to overcome the barrier of learning the target spoken language. The framework is based on analyzing the web page meta-language, extracting its content, and reformulating it in a format suitable for disabled users. Its basic objective is to support the equal right of Arab disabled people to access education and training alongside non-disabled people. Key words: Arabic Moon code, Arabic Sign Language, Deaf, Deaf-blind, E-learning Interactivity, Moon code, Web accessibility, Web framework, Web system, WWW.
Real Time Blinking Detection Based on Gabor Filter (Waqas Tariq)
A new method of blinking detection is proposed. Most importantly, a blinking detection method must be robust against different users, noise, and changes in eye shape. In this paper, we propose a blinking detection method that measures the distance between the two arcs of the eye (upper part and lower part). We detect the eye arcs by applying a Gabor filter to the eye image. The Gabor filter is advantageous in image processing applications since it extracts spatially localized spectral features, so lines, arcs, and other shapes are more easily detected. After the two eye arcs are detected, we measure the distance between them using a connected-component labeling method. The eye is marked as open when the distance between the two arcs is greater than a threshold, and as closed otherwise. The experimental results show that our proposed method is robust against different users, noise, and eye-shape changes, with perfect accuracy.
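The arc-distance idea can be sketched on a synthetic "eye" image (everything here, image size, kernel parameters, line positions, is made up, and the real method's connected-component labeling is replaced by a simple row-energy peak search): two bright horizontal lines stand in for the upper and lower eye arcs, a horizontally tuned Gabor kernel responds at each arc, and the distance between the two strongest response rows is the open/closed measure.

```python
import numpy as np

# Toy Gabor-based arc-distance measurement on a synthetic eye image.

def gabor_kernel(size, sigma, lam, theta):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # carrier direction
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)
    return g - g.mean()               # zero mean: no response on flat areas

def correlate2d(img, k):
    kh, kw = k.shape
    out = np.zeros_like(img)
    pad = np.pad(img, ((kh // 2,) * 2, (kw // 2,) * 2))
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(pad[i:i + kh, j:j + kw] * k)
    return out

eye = np.zeros((48, 48))
eye[14, 8:40] = 1.0                   # "upper arc"
eye[30, 8:40] = 1.0                   # "lower arc"

# theta = pi/2 orients the carrier vertically, so the filter fires on
# horizontal stripes such as the two arcs.
resp = np.abs(correlate2d(eye, gabor_kernel(9, 3.0, 8.0, np.pi / 2)))
row_energy = resp.sum(axis=1)
r1 = int(np.argmax(row_energy))
row_energy[max(0, r1 - 6):r1 + 7] = 0.0      # suppress the first arc
r2 = int(np.argmax(row_energy))
print("arc distance (pixels):", abs(r1 - r2))  # large -> eye open
```

Thresholding this distance then gives the open/closed decision described in the abstract.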
Computer Input with Human Eyes-Only Using Two Purkinje Images Which Works in ... (Waqas Tariq)
A method for computer input with human eyes only, using two Purkinje images, which works in real time without calibration is proposed. Experimental results show that the cornea curvature can be estimated using Purkinje images derived from two light sources, so no calibration is needed to reduce person-to-person differences in cornea curvature. It is found that the proposed system allows user movements of 30 degrees in the roll direction and 15 degrees in the pitch direction, utilizing the detected face attitude derived from the face plane formed by three feature points on the face: the two eyes and the nose or mouth. It is also found that the proposed system works in real time.
Toward a More Robust Usability concept with Perceived Enjoyment in the contex...Waqas Tariq
Mobile multimedia service is relatively new but has quickly dominated people¡¯s lives, especially among young people. To explain this popularity, this study applies and modifies the Technology Acceptance Model (TAM) to propose a research model and conduct an empirical study. The goal of study is to examine the role of Perceived Enjoyment (PE) and what determinants can contribute to PE in the context of using mobile multimedia service. The result indicates that PE is influencing on Perceived Usefulness (PU) and Perceived Ease of Use (PEOU) and directly Behavior Intention (BI). Aesthetics and flow are key determinants to explain Perceived Enjoyment (PE) in mobile multimedia usage.
Collaborative Learning of Organisational KnolwedgeWaqas Tariq
This paper presents recent research into methods used in Australian Indigenous Knowledge sharing and looks at how these can support the creation of suitable collaborative envi- ronments for timely organisational learning. The protocols and practices as used today and in the past by Indigenous communities are presented and discussed in relation to their relevance to a personalised system of knowledge sharing in modern organisational cultures. This research focuses on user models, knowledge acquisition and integration of data for constructivist learning in a networked repository of or- ganisational knowledge. The data collected in the repository is searched to provide collections of up-to-date and relevant material for training in a work environment. The aim is to improve knowledge collection and sharing in a team envi- ronment. This knowledge can then be collated into a story or workflow that represents the present knowledge in the organisation.
Our research aims to propose a global approach for specification, design and verification of context awareness Human Computer Interface (HCI). This is a Model Based Design approach (MBD). This methodology describes the ubiquitous environment by ontologies. OWL is the standard used for this purpose. The specification and modeling of Human-Computer Interaction are based on Petri nets (PN). This raises the question of representation of Petri nets with XML. We use for this purpose, the standard of modeling PNML. In this paper, we propose an extension of this standard for specification, generation and verification of HCI. This extension is a methodological approach for the construction of PNML with Petri nets. The design principle uses the concept of composition of elementary structures of Petri nets as PNML Modular. The objective is to obtain a valid interface through verification of properties of elementary Petri nets represented with PNML.
Development of Sign Signal Translation System Based on Altera’s FPGA DE2 BoardWaqas Tariq
The main aim of this paper is to build a system that is capable of detecting and recognizing the hand gesture in an image captured by using a camera. The system is built based on Altera’s FPGA DE2 board, which contains a Nios II soft core processor. Image processing techniques and a simple but effective algorithm are implemented to achieve this purpose. Image processing techniques are used to smooth the image in order to ease the subsequent processes in translating the hand sign signal. The algorithm is built for translating the numerical hand sign signal and the result are displayed on the seven segment display. Altera’s Quartus II, SOPC Builder and Nios II EDS software are used to construct the system. By using SOPC Builder, the related components on the DE2 board can be interconnected easily and orderly compared to traditional method that requires lengthy source code and time consuming. Quartus II is used to compile and download the design to the DE2 board. Then, under Nios II EDS, C programming language is used to code the hand sign translation algorithm. Being able to recognize the hand sign signal from images can helps human in controlling a robot and other applications which require only a simple set of instructions provided a CMOS sensor is included in the system.
An overview on Advanced Research Works on Brain-Computer InterfaceWaqas Tariq
A brain–computer interface (BCI) is a proficient result in the research field of human- computer synergy, where direct articulation between brain and an external device occurs resulting in augmenting, assisting and repairing human cognitive. Advanced works like generating brain-computer interface switch technologies for intermittent (or asynchronous) control in natural environments or developing brain-computer interface by Fuzzy logic Systems or by implementing wavelet theory to drive its efficacies are still going on and some useful results has also been found out. The requirements to develop this brain machine interface is also growing day by day i.e. like neuropsychological rehabilitation, emotion control, etc. An overview on the control theory and some advanced works on the field of brain machine interface are shown in this paper.
Exploring the Relationship Between Mobile Phone and Senior Citizens: A Malays...Waqas Tariq
There is growing ageing phenomena with the rise of ageing population throughout the world. According to the World Health Organization (2002), the growing ageing population indicates 694 million, or 223% is expected for people aged 60 and over, since 1970 and 2025.The growth is especially significant in some advanced countries such as North America, Japan, Italy, Germany, United Kingdom and so forth. This growing older adult population has significantly impact the social-culture, lifestyle, healthcare system, economy, infrastructure and government policy of a nation. However, there are limited research studies on the perception and usage of a mobile phone and its service for senior citizens in a developing nation like Malaysia. This paper explores the relationship between mobile phones and senior citizens in Malaysia from the perspective of a developing country. We conducted an exploratory study using contextual interviews with 5 senior citizens of how they perceive their mobile phones. This paper reveals 4 interesting themes from this preliminary study, in addition to the findings of the desirable mobile requirements for local senior citizens with respect of health, safety and communication purposes. The findings of this study bring interesting insight to local telecommunication industries as a whole, and will also serve as groundwork for more in-depth study in the future.
Principles of Good Screen Design in WebsitesWaqas Tariq
Visual techniques for proper arrangement of the elements on the user screen have helped the designers to make the screen look good and attractive. Several visual techniques emphasize the arrangement and ordering of the screen elements based on particular criteria for best appearance of the screen. This paper investigates few significant visual techniques in various web user interfaces and showcases the results for better understanding and their presence.
Virtual teams are used more and more by companies and other organizations to receive benefits. They are a great way to enable teamwork in situations where people are not sitting in the same physical place at the same time. As companies seek to increase the use of virtual teams, a need exists to explore the context of these teams, the virtuality of a team and software that may help these teams working virtualy. Virtual teams have the same basic principles as traditional teams, but there is one big difference. This difference is the way the team members communicate. Instead of using the dynamics of in-office face-to-face exchange, they now rely on special communication channels enabled by modern technologies, such as e-mails, faxes, phone calls and teleconferences, virtual meetings etc. This is why this paper is focused on the issues regarding virtual teams, and how these teams are created and progressing in Albania.
Cognitive Approach Towards the Maintenance of Web-Sites Through Quality Evalu...Waqas Tariq
It is a well established fact that the Web-Applications require frequent maintenance because of cutting– edge business competitions. The authors have worked on quality evaluation of web-site of Indian ecommerce domain. As a result of that work they have made a quality-wise ranking of these sites. According to their work and also the survey done by various other groups Futurebazaar web-site is considered to be one of the best Indian e-shopping sites. In this research paper the authors are assessing the maintenance of the same site by incorporating the problems incurred during this evaluation. This exercise gives a real world maintainability problem of web-sites. This work will give a clear picture of all the quality metrics which are directly or indirectly related with the maintainability of the web-site.
USEFul: A Framework to Mainstream Web Site Usability through Automated Evalua...Waqas Tariq
A paradox has been observed whereby web site usability is proven to be an essential element in a web site, yet at the same time there exist an abundance of web pages with poor usability. This discrepancy is the result of limitations that are currently preventing web developers in the commercial sector from producing usable web sites. In this paper we propose a framework whose objective is to alleviate this problem by automating certain aspects of the usability evaluation process. Mainstreaming comes as a result of automation, therefore enabling a non-expert in the field of usability to conduct the evaluation. This results in reducing the costs associated with such evaluation. Additionally, the framework allows the flexibility of adding, modifying or deleting guidelines without altering the code that references them since the guidelines and the code are two separate components. A comparison of the evaluation results carried out using the framework against published evaluations of web sites carried out by web site usability professionals reveals that the framework is able to automatically identify the majority of usability violations. Due to the consistency with which it evaluates, it identified additional guideline-related violations that were not identified by the human evaluators.
Robot Arm Utilized Having Meal Support System Based on Computer Input by Huma...Waqas Tariq
A robot arm utilized having meal support system based on computer input by human eyes only is proposed. The proposed system is developed for handicap/disabled persons as well as elderly persons and tested with able persons with several shapes and size of eyes under a variety of illumination conditions. The test results with normal persons show the proposed system does work well for selection of the desired foods and for retrieve the foods as appropriate as usersf requirements. It is found that the proposed system is 21% much faster than the manually controlled robotics.
Dynamic Construction of Telugu Speech Corpus for Voice Enabled Text EditorWaqas Tariq
In recent decades speech interactive systems have gained increasing importance. Performance of an ASR system mainly depends on the availability of large corpus of speech. The conventional method of building a large vocabulary speech recognizer for any language uses a top-down approach to speech. This approach requires large speech corpus with sentence or phoneme level transcription of the speech utterances. The transcriptions must also include different speech order so that the recognizer can build models for all the sounds present. But, for Telugu language, because of its complex nature, a very large, well annotated speech database is very difficult to build. It is very difficult, if not impossible, to cover all the words of any Indian language, where each word may have thousands and millions of word forms. A significant part of grammar that is handled by syntax in English (and other similar languages) is handled within morphology in Telugu. Phrases including several words (that is, tokens) in English would be mapped on to a single word in Telugu.Telugu language is phonetic in nature in addition to rich in morphology. That is why the speech technology developed for English cannot be applied to Telugu language. This paper highlights the work carried out in an attempt to build a voice enabled text editor with capability of automatic term suggestion. Main claim of the paper is the recognition enhancement process developed by us for suitability of highly inflecting, rich morphological languages. This method results in increased speech recognition accuracy with very much reduction in corpus size. It also adapts Telugu words to the database dynamically, resulting in growth of the corpus.
An Improved Approach for Word Ambiguity RemovalWaqas Tariq
Word ambiguity removal is a task of removing ambiguity from a word, i.e. correct sense of word is identified from ambiguous sentences. This paper describes a model that uses Part of Speech tagger and three categories for word sense disambiguation (WSD). Human Computer Interaction is very needful to improve interactions between users and computers. For this, the Supervised and Unsupervised methods are combined. The WSD algorithm is used to find the efficient and accurate sense of a word based on domain information. The accuracy of this work is evaluated with the aim of finding best suitable domain of word. Keywords: Human Computer Interaction, Supervised Training, Unsupervised Learning, Word Ambiguity, Word sense disambiguation
Parameters Optimization for Improving ASR Performance in Adverse Real World N...Waqas Tariq
From the existing research it has been observed that many techniques and methodologies are available for performing every step of Automatic Speech Recognition (ASR) system, but the performance (Minimization of Word Error Recognition-WER and Maximization of Word Accuracy Rate- WAR) of the methodology is not dependent on the only technique applied in that method. The research work indicates that, performance mainly depends on the category of the noise, the level of the noise and the variable size of the window, frame, frame overlap etc is considered in the existing methods. The main aim of the work presented in this paper is to use variable size of parameters like window size, frame size and frame overlap percentage to observe the performance of algorithms for various categories of noise with different levels and also train the system for all size of parameters and category of real world noisy environment to improve the performance of the speech recognition system. This paper presents the results of Signal-to-Noise Ratio (SNR) and Accuracy test by applying variable size of parameters. It is observed that, it is really very hard to evaluate test results and decide parameter size for ASR performance improvement for its resultant optimization. Hence, this study further suggests the feasible and optimum parameter size using Fuzzy Inference System (FIS) for enhancing resultant accuracy in adverse real world noisy environmental conditions. This work will be helpful to give discriminative training of ubiquitous ASR system for better Human Computer Interaction (HCI). Keywords: ASR Performance, ASR Parameters Optimization, Multi-Environmental Training, Fuzzy Inference System for ASR, ubiquitous ASR system, Human Computer Interaction (HCI)
Comparative Calibration Method Between Two Different Wavelengths With Aureole Observations at Relatively Long Wavelength
Kohei Arai & Xing Ming Liang
International Journal of Applied Science (IJAS), Volume (2) : Issue (3) : 2011 93
Kohei Arai arai@is.saga-u.ac.jp
Information Science Department
Saga University
Saga City, 840-8502, Japan
Xing Ming Liang xingming.liang@noaa.gov
Center for Satellite Application and Research (STAR),
NOAA/NESDIS,
Camp Springs, MD 20746, U.S.A.
Abstract
A multi-stage method for the calibration of sunphotometers is proposed by combining a comparative calibration method between two different wavelengths with an aureole observation method for long-wavelength calibration. Its effectiveness in reducing the influence of molecular and aerosol extinction on calibration under unstable turbidity conditions is clarified. By comparing results calculated with the proposed method (PM) and the existing Improved Langley Method (ILM), an individual calibration method, it is found that the proposed method is superior in terms of calibration accuracy. Namely, in a comparison between ILM and PM using the 0.87 um band as reference, the largest calibration errors by PM are 0.0014 and 0.0428, lower than those by ILM (0.011 and 0.0489), for sky radiances with no error and with -3~+3% and -5~+5% errors, respectively. By analyzing 15 days of observation data from a POM-1 skyradiometer, the largest standard deviation of the calibration constants by PM is 0.02016, lower than that by ILM (0.03858).
Keywords: Sunphotometer, Calibration, Langley Method, Modified Langley Method, Aureole,
Solar Direct Irradiance, Solar Diffuse Irradiance.
1. INTRODUCTION
Sunphotometers have been applied widely to measure aerosol optical properties for analyzing local and global climate, for example within the Aerosol Robotic Network (AERONET)¹. About 500 institutions perform ground-based aerosol monitoring with sunphotometers or skyradiometers in AERONET. Thus, maintenance of the calibration constants of sunphotometers is essential in such work, especially for monitoring long-term variations of atmospheric turbidity [1]-[9].
It is well known that the common Langley method (CLM) cannot be relied upon to obtain an accurate calibration constant for a sunphotometer, owing to the influence of unstable atmospheric extinction [10]-[13]. In the CLM, the calibration constant is obtained by extrapolating the plot of the logarithm of the sunphotometer reading against atmospheric air mass to air mass 0. A large error in the calibration constant will occur, however, if the optical depth of the atmosphere changes during the calibration period because of unstable atmospheric turbidity. For this reason, many previous works have focused on reducing the influence of unstable atmospheric turbidity on instrument calibration. A typical representative is the Improved Langley method proposed by T. Nakajima [13], which introduces an analysis of the volume spectrum to first estimate the aerosol optical depth, in order to avoid the error due to changes of the aerosol optical depth under unsteady turbidity conditions. In consequence, it improved sunphotometer calibration considerably compared with the common Langley method.
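The Langley extrapolation described above can be sketched numerically. The following snippet uses synthetic, illustrative values (not instrument data) and assumes only NumPy; it fits the logarithm of the reading against air mass and reads the calibration constant off the intercept at air mass 0.

```python
# Common Langley method (CLM) sketch: under a stable atmosphere,
# ln F = ln F0 - m*tau, so a straight-line fit of ln F against air mass m
# extrapolates to the calibration constant ln F0 at m = 0.
# All numbers below are synthetic, for illustration only.
import numpy as np

def langley_fit(m, ln_f):
    """Least-squares line through (m, ln F); returns (ln F0, tau)."""
    slope, intercept = np.polyfit(m, ln_f, 1)
    return intercept, -slope

m = np.linspace(1.5, 4.5, 20)      # air-mass range of a typical Langley run
ln_f = 1.0 - 0.3 * m               # noiseless readings: ln F0 = 1.0, tau = 0.3
ln_f0, tau = langley_fit(m, ln_f)
print(round(ln_f0, 4), round(tau, 4))   # recovers 1.0 and 0.3
```

With a perfectly stable optical depth the intercept recovers ln F0 exactly; the sections below concern what happens when that stability assumption fails.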
¹ aeronet.gsfc.nasa.gov/Operational/pictures/.../Cimel_set_up.PDF
Some factors, however, such as observation errors in the circumsolar radiances, scattering by atmospheric molecules, estimation errors in the volume spectrum, and other assumptions about atmospheric conditions, make the analysis of the volume spectrum insufficient to remove the contribution of multiple scattering. Estimation errors of the aerosol optical depth are also sometimes significant. Thus the estimation accuracy of the calibration constants by ILM also degrades, especially at short wavelengths. On the other hand, the ratio Langley method (RLM), proposed by B.W. Forgan (1994) [12], depends on a known calibration at a reference wavelength to permit calibration at the others. With this method, calibration accuracy can be improved by selecting a long wavelength that is well calibrated by ILM as the reference for calibrating the other wavelengths.
In the following sections, a multi-stage calibration method for sunphotometers that combines ILM with RLM is proposed. Results from a numerical simulation and an analysis of actual skyradiometer measurements are then presented to validate the proposed method, followed by conclusions and some discussion.
2. ANALYSIS OF VOLUME SPECTRUM AND IMPROVED LANGLEY METHOD
The CLM is based on the Beer-Lambert law as follows,

ln F = ln F0 − mτ (1)
where F and F0 are the solar downward irradiances at the surface and outside the atmosphere, respectively, τ is the total optical depth of the atmosphere, and m is the atmospheric air mass, approximately equal to 1/cos(θ0) for solar zenith angles θ0 less than 75°. Invariance of the aerosol optical depth, i.e. a stable atmospheric condition across the different solar zenith angles, is necessary to estimate the solar constant accurately with the CLM. It is difficult, however, to satisfy this temporal stability of the atmosphere at most locations, except in special regions such as high mountain elevations. A sensitivity analysis of the calibration for different aerosol models was performed by M. Tanaka (1986) [11]; retrieval errors of the calibration constants by the CLM were about 2.6~10% when the aerosol optical depth varied parabolically within ±10%.
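The effect of such unstable turbidity on the CLM intercept can be reproduced numerically. The sketch below (synthetic values only, not a reproduction of the cited analysis) lets the optical depth vary parabolically by ±10% during the run and shows that the retrieved intercept is then biased.

```python
# Sketch of CLM sensitivity to drifting turbidity: tau varies parabolically
# by +-10% over the run, so the Beer-Lambert readings no longer lie on a
# straight line in m, and the extrapolated intercept ln F0 is biased.
# Synthetic, illustrative values only.
import numpy as np

m = np.linspace(1.5, 4.5, 30)
t = np.linspace(-1.0, 1.0, m.size)     # pseudo-time across the run
tau = 0.3 * (1.0 + 0.10 * t**2)        # parabolic +-10% variation of tau
ln_f = 1.0 - tau * m                   # readings under drifting turbidity

slope, intercept = np.polyfit(m, ln_f, 1)
bias = intercept - 1.0                 # error in the retrieved ln F0
print(abs(bias) > 1e-3)                # prints True: the intercept is biased
```

The bias appears because the extrapolation attributes the time-varying part of τ to the intercept, which is exactly the failure mode the improved methods below address.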
To remove the influence of the varying aerosol optical depth under unsteady turbidity conditions during the calibration period, T. Nakajima (1996) proposed an improved Langley method in which the calibration is performed with simultaneous measurements combining the direct-solar and circumsolar radiation [13]. The aerosol optical depth is first estimated by an analysis of the volume spectrum (AVS). In this analysis, the circumsolar radiances are replaced by a relative intensity as in equation (2).
R(θ) = F(θ) / (F m ΔΩ) = ωτP(θ) + q(θ) (2)
where R(θ) is the relative intensity of the circumsolar radiance F(θ), normalized by the direct irradiance F, the approximate air mass m, and the solid angle ΔΩ; ω is the single scattering albedo; q(θ) is the multiple scattering contribution; and P(θ) is the total phase function of aerosols and molecules at scattering angle θ, given by
P(θ) = (ωa τa Pa(θ) + ωm τm Pm(θ)) / (ωτ) (3)
where ωa, τa, and Pa(θ) are the single scattering albedo, the optical depth, and the phase function of the aerosol, respectively, and ωm, τm, and Pm(θ) are the corresponding quantities for air molecules. Assuming the aerosol particles are spherical and homogeneous, by Mie theory ωa τa Pa(θ) and the aerosol optical depth can be written as
ωa τa Pa(θ) = ∫[r1, r2] K(r, k, θ, m̃) v(r) d ln r (4)

τa = ∫[r1, r2] Kext(r, k, m̃) v(r) d ln r (5)
where v(r) = (4π/3) r⁴ n(r) and n(r) is the columnar radius distribution of the aerosol; k = 2π/λ; m̃ = n − iξ is the refractive index; and Kext(r, k, m̃) and K(r, k, θ, m̃) are kernel functions that can be calculated by Mie theory. Using an inversion scheme that solves the radiative transfer equation to repeatedly correct the multiple scattering contribution q(θ) [4], an approximate solution of the volume spectrum v(r) can be estimated from the circumsolar radiances. The aerosol optical depth can then be estimated by equation (5). Thus, equation (1) can be rewritten as
ln F + m(τm + τo) = ln F0 − mτa (6)
where τo is the ozone optical depth. The calibration constant can then be obtained by extrapolating the plot of the left-hand side against mτa to mτa = 0. This method is referred to as the Improved Langley Method (ILM). Because most of the influence of the varying aerosol optical depth under turbid atmospheric conditions can be estimated from the circumsolar radiances, the estimation accuracy of the calibration constants is improved conspicuously compared with the CLM, which plots ln F against m.
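The ILM extrapolation of equation (6) can be sketched as follows. All numbers are synthetic and for illustration only: τa is taken as known per sample, standing in for the AVS estimate from circumsolar radiances.

```python
# Sketch of the ILM extrapolation, eq. (6): with tau_a estimated per sample,
# ln F + m*(tau_m + tau_o) is regressed against m*tau_a and extrapolated to
# m*tau_a = 0, recovering ln F0 even though tau_a drifts during the run.
# Synthetic, illustrative values only.
import numpy as np

m = np.linspace(1.5, 4.5, 20)
tau_m, tau_o = 0.10, 0.02                     # molecular and ozone depths
tau_a = 0.25 + 0.05 * np.sin(np.pi * m / 3)   # drifting aerosol optical depth
ln_f = 1.0 - m * (tau_m + tau_o + tau_a)      # readings with true ln F0 = 1.0

x = m * tau_a                                 # abscissa of the ILM plot
y = ln_f + m * (tau_m + tau_o)                # left-hand side of eq. (6)
slope, ln_f0 = np.polyfit(x, y, 1)
print(round(ln_f0, 4), round(slope, 4))       # intercept 1.0, slope -1.0
```

Note that the drift in τa, which biased the CLM sketch earlier, is absorbed into the abscissa here, so the intercept is recovered exactly when τa is known without error.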
On the other hand, the small angular extent (θ < 30°) of the circumsolar radiation, scattering by atmospheric molecules, observation errors in the circumsolar radiances, estimation errors in the volume spectrum, and other assumptions about atmospheric conditions make the inversion scheme insufficient to remove the contribution of multiple scattering when solving the radiative transfer equation (T. Nakajima, 1996) [13]. Some errors therefore occur in the estimation of the aerosol optical depth by AVS, so accurate estimation of the aerosol optical depth at every wavelength of a sunphotometer is hardly assured. Figure 1 shows the difference between the aerosol optical depth estimated by AVS and that obtained by reanalysis of the volume spectrum from skyradiometer measurements on several days.
4. Kohei Arai & Xing Ming Liang
FIGURE 1: Differences of aerosol optical depth estimated by AVS and by reanalysis of the volume spectrum for air mass 1.5 to 4.5. Data were observed by the POM-1 Skyradiometer² on 11/26/2003, 12/03/2003 and 12/04/2003 at Saga, Japan.
It is found that the differences of aerosol optical depth between the AVS estimate and the reanalysis are sometimes larger than 10%, which means that the accuracy of the calibration constants estimated by ILM can become low.
3. THE PROPOSED METHOD
From Figure 1, it is also found that the differences of aerosol optical depth at long wavelengths are smaller than those at short wavelengths. This is because the influence of multiple scattering is smaller at long wavelengths, so the optical depth can be estimated more accurately. It also means that the estimation accuracy of the calibration constants is higher at long wavelengths than at short ones. On the other hand, the Ratio Langley Method, proposed by B.W. Forgan (1994) [12], depends on a known calibration at a reference wavelength to permit calibration at the others, by assuming that the relative size distribution of the aerosol remains constant as in equation (7), so that the ratio of aerosol optical depths between different wavelengths is assured to be constant as in equation (8).
$$\tau_a(\lambda, t) = A(t) \int K_{ext}(2\pi r/\lambda, \tilde{m})\, f(r)\, d\ln r \qquad (7)$$

$$\tau_a(\lambda_1, t)/\tau_a(\lambda_2, t) = \tau_a(\lambda_1, t_0)/\tau_a(\lambda_2, t_0) = \psi \qquad (8)$$

where $f(r)$ is the relative size distribution, which depends only on the particle radius $r$, and $A(t)$ is the multiplier necessary to produce the correct size distribution at time $t$. Thus the calibration at the other wavelengths can be performed by using the reference wavelength as in equation (9).
$$\ln F(\lambda_1, m) + (\tau_m(\lambda_1) + \tau_o(\lambda_1))\, m = \ln F_0(\lambda_1) - \psi\, m\, \tau_a(\lambda_0) \qquad (9)$$

where $\lambda_0$ and $\lambda_1$ are the reference wavelength and the calibrated wavelength, respectively, and $\psi$ is a constant. Because $\tau_a(\lambda_0)$ is well determined at the calibrated reference wavelength, $\ln F_0(\lambda_1)$ can be calculated accurately by least-squares regression of the left-hand side of equation (9) against $m\,\tau_a(\lambda_0)$. It is therefore possible to improve the calibration accuracy by selecting a long wavelength that is well calibrated by ILM as the reference for calibrating the others.
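In the same illustrative spirit, the RLM regression of equation (9) can be sketched as below; `ratio_langley` is a hypothetical helper name, and the intercept and slope of the fit give $\ln F_0(\lambda_1)$ and $-\psi$:

```python
import numpy as np

def ratio_langley(ln_F1, m, tau_m1, tau_o1, tau_a0):
    """Ratio Langley regression of equation (9).

    ln_F1  : measured ln(F) at the wavelength being calibrated (array)
    m      : air mass (array)
    tau_m1, tau_o1 : molecular and ozone optical depths at that wavelength
    tau_a0 : aerosol optical depth at the well-calibrated reference wavelength
    Returns (ln_F0_lambda1, psi).
    """
    y = ln_F1 + m * (tau_m1 + tau_o1)   # left-hand side of equation (9)
    x = m * tau_a0                      # regressor
    slope, intercept = np.polyfit(x, y, 1)
    return intercept, -slope            # intercept = ln F0(lambda1), -slope = psi
```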
Therefore, a multi-stage calibration method is proposed. In the proposed method, accurate calibration constants at a long wavelength, estimated by ILM, are used as a reference for the calibration at the other wavelengths. Because ILM works well within the Skyrad.pack code developed by T. Nakajima (1996) [4], this code is used in our algorithm. The proposed process flow is shown in Figure 2.
² It is similar to the Aureolemeter for AERONET, which is manufactured by Prede Co., Ltd.
FIGURE 2: The algorithm of the multi-stage calibration method.
Firstly, the code Skyrad.pack.v42 is introduced into our algorithm. It includes three processes: level 0, calibration, and level 1. In level 0, the aerosol optical depth and the volume spectrum are approximately estimated from the circumsolar radiation based on AVS. In the calibration process, the calibrations are performed by ILM. In level 1, the calibration constants estimated by ILM are combined with the direct and sky radiances, so that more accurate solutions for the aerosol optical depth, the aerosol volume spectrum and the refractive index of the aerosol can be estimated by reanalysis of the volume spectrum. Consequently, the aerosol optical depth estimated in level 1, i.e. from the reanalysis of the volume spectrum, is used instead of that from level 0, and a calibration is performed at the selected reference wavelength to obtain more accurate calibration constants. Finally, based on RLM, the well-calibrated reference wavelength is used to calibrate the other wavelengths.
4. NUMERICAL SIMULATIONS
A numerical simulation is conducted to check the validity of the proposed method by comparison with ILM. The wavelengths 0.4, 0.5, 0.675, 0.87 and 1.02µm are selected, in accordance with the POM-1 Skyradiometer manufactured by Prede Co., Ltd., and the reference wavelength is set at 0.87µm. The simulated data are generated by Skyrad.pack.v42. The aerosol size distribution is defined as the sum of two log-normal modes (bi-modal) as follows:
$$n(\ln r) = \sum_{i=1}^{2} \frac{C_i}{\sqrt{2\pi}\,\log\sigma_i} \exp\!\left(-\frac{(\log r - \log r_i)^2}{2\log^2\sigma_i}\right) \qquad (10)$$
where $n(\ln r)\,d\ln r$ is the number density of particles in the logarithmic radius interval $d\ln r$. The values of $C_i$ are set to 1, and $\sigma_i$ and $r_i$ are set the same as the aerosol type observed at Saga, Japan in 2003. The parameters are shown in Table 1.
Mode No.   C_i    r_i (µm)   σ_i
1          1.0    0.37       1.95
2          1.0    3.06       2.36

TABLE 1: The parameters for the log-normal distribution.
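For reference, equation (10) with the Table 1 parameters can be evaluated as in the following sketch (the function name is ours, and base-10 logarithms are assumed for "log"):

```python
import numpy as np

def bimodal_lognormal(r, C=(1.0, 1.0), r_mode=(0.37, 3.06), sigma=(1.95, 2.36)):
    """Bi-modal log-normal number density n(ln r) of equation (10).
    Defaults are the Table 1 parameters; base-10 logs are assumed."""
    r = np.asarray(r, dtype=float)
    n = np.zeros_like(r)
    for c, ri, si in zip(C, r_mode, sigma):
        # one log-normal mode centred at radius ri with width log10(si)
        n += (c / (np.sqrt(2.0 * np.pi) * np.log10(si))
              * np.exp(-(np.log10(r) - np.log10(ri)) ** 2
                       / (2.0 * np.log10(si) ** 2)))
    return n
```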
The refractive index of the aerosol is set to m = 1.50 − 0.01i, and the extraterrestrial solar irradiance is set to 1.0. The variation of the aerosol optical depth with time is given as follows (Shaw, 1976) [14]:

$$\tau_a = \tau_{a0}(1 + \alpha t^2) \qquad (11)$$

where $\tau_{a0}$ is the aerosol optical depth at noon, set to 0.1, 0.2 and 0.3, and $\alpha$ is assumed to be 0.011, so that the aerosol optical depth changes by 0~20% of $\tau_{a0}$ as the air mass varies from 1.5 to 4.5. Random errors of 0, −3~+3% and −5~+5% are added to the sky radiances to evaluate the calibration accuracies of ILM and the proposed method (PM). Figure 3 (a) and (b) shows the estimation errors of the aerosol optical depth by AVS and by reanalysis of the volume spectrum for the wavelengths 0.4, 0.5 and 0.87µm with no error in the sky radiances.
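The simulated optical-depth variation of equation (11) and the random radiance perturbations can be sketched as follows (the helper names are ours; t is measured in hours from local noon):

```python
import numpy as np

def simulated_tau_a(t, tau_a0=0.1, alpha=0.011):
    """Time-varying aerosol optical depth of equation (11)."""
    return tau_a0 * (1.0 + alpha * t ** 2)

def add_radiance_error(radiance, level=0.03, rng=None):
    """Perturb simulated sky radiances with uniform random errors of
    +/- level (3% or 5% in the experiments above)."""
    rng = np.random.default_rng() if rng is None else rng
    return radiance * (1.0 + rng.uniform(-level, level, size=np.shape(radiance)))
```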
From this figure, it may be concluded that:
(1) the estimation accuracy of the aerosol optical depth by reanalysis of the volume spectrum is almost always better than that by AVS;
(2) the estimation errors of the aerosol optical depth in the 0.87µm band are smaller than those in the 0.4 and 0.5µm bands.
The same conclusions hold for the cases with −3~+3% and −5~+5% errors in the sky radiances.
Table 2 (a), (b) and (c) compare the estimation accuracies of the calibration constants by ILM and PM for the aforementioned five wavelengths. From the tables, it is found that the calibration accuracies of PM are higher, especially at the short wavelength 0.4µm.
To evaluate the influence on the calibration accuracy of changes in the relative size distribution, $\sigma_i$ and $r_i$ in equation (10) are changed by ±3% and ±5%. The calibration results are shown in Table 3. From the table, it may be said that the calibration accuracies of the proposed method remain higher than those of ILM for a ±3% change.
5. VALIDATION THROUGH OBSERVATIONS
The proposed method is also validated by analysis of observation data from the POM-1 Skyradiometer. The POM-1 Skyradiometer measures the direct and diffuse solar irradiance as well as the aureole in the solar almucantar and in the principal plane. It has seven filters with central wavelengths at 0.315, 0.40, 0.50, 0.675, 0.870, 0.94 and 1.02µm. The filters centered at 0.315µm and 0.94µm are used for the estimation of O3 concentration and precipitable water, respectively; the other filters are used for aerosol optical depth measurements. The instrument has a 0.5° half-angle field of view. It is located at Saga University, and observations were performed from September 2003 to May 2004. Data from 15 cloud-free days are selected.
FIGURE 3: Estimation errors of the aerosol optical depth by AVS and by reanalysis of the volume spectrum at the wavelengths 0.4, 0.5 and 0.87µm, without any error in the sky radiance measurements.
Figure 4 shows the calibration constants at the reference wavelength estimated by ILM on the 15 days. The accuracy is high, with a standard deviation of only 1%.
FIGURE 4: Calibration for the reference wavelength by ILM
The calibration results of ILM and PM are shown in Figure 5 (a) and (b) and Table 3. Figure 5 (a) shows the calibration constants for the wavelengths 0.4µm and 0.5µm, while Figure 5 (b) shows those for 0.675µm and 1.02µm. Table 3 gives the standard deviations for each band over the 15 days. It is found that the standard deviations of PM are smaller than those of ILM, especially at the wavelength 0.4µm. This also means that fewer calibration runs are required for PM than for ILM to attain the same accuracy.
(a) No error in the circumsolar radiances

            τ_a0 = 0.1        τ_a0 = 0.2        τ_a0 = 0.3
WV (µm)    ILM      PM       ILM      PM       ILM      PM
0.4       0.0008   0.0006   0.0029   0.0009   0.013    0.0014
0.5       0.0003   0.0006   0.0015   0.0006   0.01     0.0009
0.675     0.0012   0.0005   0.0006   0.0006   0.005    0.0005
0.87      0.0002   0.0002   0.0001   0.0001   0.003    0.0004
1.02      0.0002   0.0002   0.0001   0.0001   0.002    0.0004

(b) −3%~+3% random errors in the circumsolar radiances

            τ_a0 = 0.1        τ_a0 = 0.2        τ_a0 = 0.3
WV (µm)    ILM      PM       ILM      PM       ILM      PM
0.4       0.011    0.004    0.017    0.009    0.023    0.011
0.5       0.008    0.003    0.009    0.006    0.012    0.009
0.675     0.003    0.002    0.003    0.002    0.015    0.007
0.87      0.006    0.002    0.001    0.001    0.002    0.001
1.02      0.002    0.001    0.001    0.002    0.001    0.001

(c) −5%~+5% random errors in the circumsolar radiances

            τ_a0 = 0.1        τ_a0 = 0.2        τ_a0 = 0.3
WV (µm)    ILM      PM       ILM      PM       ILM      PM
0.4       0.013    0.007    0.015    0.005    0.027    0.014
0.5       0.006    0.006    0.005    0.004    0.011    0.01
0.675     0.003    0.003    0.003    0.003    0.007    0.005
0.87      0.001    0.001    0.002    0.001    0.001    0.001
1.02      0.001    0.001    0.002    0.001    0.001    0.002
TABLE 2: Comparison of the calibration estimation errors of ILM and the proposed method for optical depths 0.1, 0.2 and 0.3.
            standard deviation
WV (µm)    ILM        PM
0.4       0.03858   0.02016
0.5       0.02219   0.01691
0.675     0.01837   0.01295
1.02      0.01022   0.00938

TABLE 3: Comparison of the standard deviations of the calibration constants between ILM and PM.
FIGURE 5: Calibration by ILM and PM for (a) 0.4µm and 0.5µm, and (b) 0.675µm and 1.02µm.
6. CONCLUSIONS
A multi-stage calibration method combining the Improved Langley Method with the Ratio Langley Method is proposed in this paper. The numerical simulation shows that estimation errors of the aerosol optical depth decrease the calibration precision of ILM. In a comparison between ILM and the proposed method using the 0.87µm band as reference, the largest calibration errors of PM (0.0014 and 0.0428) are smaller than those of ILM (0.011 and 0.0489) for sky radiances without any error and with −3~+3% and −5~+5% errors, respectively. From the analysis of 15 days of observation data from the POM-1 Skyradiometer, the largest standard deviation of the calibration constants by PM is 0.02016, smaller than that by ILM (0.03858). Thus it may be said that the proposed calibration method is superior to the conventional methods.
7. ACKNOWLEDGEMENTS
The authors thank Prof. T. Nakajima and Engineer M. Yamano of the Center for Climate System Research, The University of Tokyo, for their constructive comments.
8. REFERENCES
[1] Ramachandran, Justice, Abrams (Eds.), Kohei Arai et al., Land Remote Sensing and Global Environmental Changes, Part II, Sec. 5: ASTER VNIR and SWIR Radiometric Calibration and Atmospheric Correction, 83-116, Springer, 2010.
[2] Arai, K, Fundamental theory of remote sensing, Gakujutsu-Tosho-Shuppan Co. Ltd., 2001.
[3] Arai, K. Self learning on remote sensing, Morikita-Shuppan Co. Ltd., 2004.
[4] Arai, K., X.M. Liang, Simultaneous estimation of aerosol refractive index and size distribution
using solar direct, diffuse and aureole based on simulated annealing, Journal of Remote
Sensing Society of Japan, 23, 1, 11-20, 2003.
[5] Arai K., X.M. Liang, Estimation method for Top of the Atmosphere radiance taking into
account up and down welling polarization components, Journal of Japan Society of
Photogrammetry and Remote Sensing, 44, 3, 4-12, 2005.
[6] Liang X.M., K.Arai, Simultaneous estimation of aerosol refractive index and size distribution
taking into account solar direct, diffuse and aureole of polarization component, Journal of
Remote Sensing Society of Japan, 25, 4, 357-366, 2005.
[7] Arai K., X.M. Liang, Characterization of aerosols in Saga city areas, Japan with direct and
diffuse solar irradiance and aureole observations, Advances in Space Research, 39, 1, 23-
27, 2007.
[8] Arai K., Y. Iisaka and X.M. Liang, Aerosol parameter estimation with changing observation angle of ground-based polarization radiometer, Advances in Space Research, 39, 1, 28-31, 2007.
[9] Arai K., X.M. Liang, Improvement of calibration accuracy of skyradiometer which allows
solar direct, aureole and diffuse measurements based on Improved Modified Langley,
Journal of Japan Society of Photogrammetry and Remote Sensing, 47, 4, 21-28, 2008.
[10] Schotland, R.M., Lea, T.K., Bias in a solar constant determination by the Langley method
due to structured atmospheric aerosol. Appl. Opt., 25, 2486-2491, 1986.
[11] Tanaka, M., Nakajima, T., Shiobara, M., Calibration of a sunphotometer by simultaneous
measurements of direct-solar and circumsolar radiations. Appl. Opt., 25, 1170-1176, 1986.
[12] Forgan, B.W., General method for calibrating sun photometers. Appl. Opt., 33, 4841-4850,
1994.
[13] Nakajima, T., Tonna, G., Rao, R. et al., Use of sky brightness measurements from ground for remote sensing of particulate polydispersions. Appl. Opt., 35, 2672-2686, 1996.
[14] Shaw, G.E., Error analysis of multi-wavelength sunphotometry. Pure Appl. Geophys., 114,
1, 1976.