Frequency analysis of extreme low mean annual rainfall events is important to water resource planners at the catchment level because mean annual rainfall is a key parameter in determining mean annual runoff, which in turn is an important input in determining the surface water available for water resource infrastructure development. To carry out frequency analysis of extreme low mean annual rainfall events, it is necessary to identify the best-fit probability distribution models (PDMs) for the analysis. The primary objective of the study was to develop two model identification criteria: the first to identify candidate probability distribution models, and the second to select the best-fit probability distribution models from those candidates. The secondary objectives were:
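The abstract does not reproduce the two criteria themselves, but the general workflow of screening candidate PDMs and picking a best fit can be sketched with standard goodness-of-fit measures. The following Python sketch fits four commonly used candidate distributions to a synthetic annual-rainfall sample and ranks them by the Akaike information criterion (AIC), with the Kolmogorov-Smirnov statistic as a second screen; the distributions, data and criteria here are illustrative assumptions, not the study's own.

```python
import numpy as np
from scipy import stats

# Hypothetical annual rainfall series (mm); the study's own data are not shown here.
rng = np.random.default_rng(42)
sample = stats.gamma.rvs(a=4.0, scale=120.0, size=60, random_state=rng)

candidates = {
    "gamma": stats.gamma,
    "lognorm": stats.lognorm,
    "weibull_min": stats.weibull_min,
    "gumbel_r": stats.gumbel_r,
}

results = []
for name, dist in candidates.items():
    params = dist.fit(sample)                      # maximum-likelihood fit
    loglik = np.sum(dist.logpdf(sample, *params))
    aic = 2 * len(params) - 2 * loglik             # Akaike information criterion
    ks = stats.kstest(sample, dist.cdf, args=params).statistic
    results.append((name, aic, ks))

# Candidate models ranked by AIC; the KS statistic acts as a second screen.
for name, aic, ks in sorted(results, key=lambda r: r[1]):
    print(f"{name:12s} AIC={aic:8.1f} KS={ks:.3f}")
```

In practice the first criterion would shortlist distributions whose KS statistic clears a threshold, and the second would rank the shortlist by a measure such as AIC.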
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
THE APPLICATION OF MATHEMATICAL MODELS IN MANAGEMENT OF AQUIFER - amsjournal
Humans understood the importance of water from religious texts long before feeling the water-shortage crisis. Given present world conditions, water may come to redraw boundaries in the future. The Imamzadeh Jaafar plain is located 5 kilometers northeast of Gachsaran, in the south of Kohgilooye and Boerahmad province. The plain covers 61 km2 and contains two aquifers, alluvial and carbonate, which supply agricultural, industrial and domestic water needs. Heavy exploitation and transfer of groundwater resources, especially by the National Oil Company, caused a large drawdown in the alluvial aquifer, 1.85 m over the five-year period from 1361 to 1365, as reported by Mahab Ghods Consulting Engineers. There are two artificial recharge projects in the plain, one flood-spreading system and one recharge-pond system. To devise a future water resources management program, the hydrogeological behavior of the alluvial aquifer and the effects of artificial recharge must be evaluated. Data on bedrock, hydrodynamic coefficients, topography and water resources were collected, field surveys were performed, and the required maps were prepared. Using a conceptual model and the MODFLOW PMWIN code, a mathematical model of the plain was calibrated against water year 1380-81 and then verified against water year 1384-85. The verified model was used to predict future aquifer conditions. The results indicated a rapid response of the aquifer to precipitation due to high transmissivity; a positive water budget in year 1385 compared with year 1365; a change of groundwater flow direction from the plain outlet towards the center of the plain in response to heavy exploitation there; a rise of 1 to 6 m in water levels in wells downstream of the flood-spreading system; and a fall of 1 to 4 m in water levels in wells downstream of the recharge-pond system.
II INTERNATIONAL WORKSHOP: SUSTAINABLE MANAGEMENT OF WATER RESOURCES IN IRRIGATED AGRICULTURE:
Research, Public Policy, Rural Extension and Farmer Participation in Nebraska, USA, and Western Bahia, Brazil
AIBA AUDITORIUM - BARREIRAS, BA
Application of GIS and MODFLOW to Ground Water Hydrology - A Review - IJERA Editor
Groundwater is one of the most valuable natural resources, supporting human health, economic development and ecological diversity. Due to over-exploitation, groundwater systems are stressed and require management to keep groundwater resources within acceptable limits. With the development of computers and advances in information technology, efficient techniques for water management have evolved. The main intent of this paper is to present a comprehensive review of the application of GIS (Geographic Information Systems) coupled with the MODFLOW package for groundwater management and development. Two major areas of GIS application in groundwater hydrology are discussed: (i) GIS-based subsurface flow and pollution modelling, and (ii) selection of artificial recharge sites. Although the use of these techniques in groundwater studies has increased rapidly over the last decade, the success rate is very limited. Based on this review, it is concluded that the integration of GIS and MODFLOW has great potential to revolutionize the monitoring and management of vital groundwater resources in the future.
Analysis and Characterization of Kainji Reservoir Inflow System - Crimson Publ... - CrimsonpublishersEAES
Analysis and Characterization of Kainji Reservoir Inflow System by Mohammed J Mamman*, Otache Y Matins and Jibril Ibrahim in Environmental Analysis & Ecology Studies
Flood plains to floor drains design standard adaptation for urban flood risk ... - Robert Muir
Presented to the Flood Master Class hosted by Insurance Business Magazine, this presentation examines quantitative risk assessment of riparian, overland and wastewater (sanitary) sewer system flooding. Analysis of historical flooding in the City of Toronto and the City of Markham shows it to be highly correlated with design standard limitations tied to the era of construction. Based on detailed GIS spatial analysis, risks are shown to extend over a range of scales, from floodplain (river) to floor drain (homes). Flood risk mitigation measures are presented to achieve design standard adaptation in local areas with specific limitations.
Comparison and Evaluation of Support Vector Machine and Gene Programming in R... - AI Publications
Simulation and evaluation of sediment are important issues in water resources management. Common methods for measuring sediment concentration are generally time-consuming and costly, and sometimes lack sufficient accuracy. In this research we evaluated sediment loads for the Kashkan River, Iran, using a Support Vector Machine (SVM), and compared it with the common Gene-Expression Programming (GEP) approach. Flow discharge at different time lags was taken as input and sediment discharge over the study period (1998-2018) as output. The correlation coefficient, root mean square error, mean absolute error and Nash-Sutcliffe coefficient were used to evaluate and compare the performance of the models. The results showed that both models estimate sediment discharge with acceptable accuracy, but the support vector machine model had the highest correlation coefficient (0.994), the minimum root mean square error (0.001 ton/day) and mean absolute error (0.001 ton/day), and the best Nash-Sutcliffe coefficient (0.988), and hence was preferred at the verification stage. Finally, the results showed that the support vector machine has great capability in estimating minimum and maximum sediment discharge values.
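The four skill scores the abstract reports (correlation coefficient, RMSE, MAE and Nash-Sutcliffe efficiency) are standard and easy to compute. A minimal Python sketch, using made-up sediment discharges rather than the Kashkan River data:

```python
import numpy as np

def evaluate(observed, simulated):
    """Return the four skill scores used in the comparison:
    correlation coefficient, RMSE, MAE and Nash-Sutcliffe efficiency."""
    observed = np.asarray(observed, float)
    simulated = np.asarray(simulated, float)
    err = simulated - observed
    r = np.corrcoef(observed, simulated)[0, 1]
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    nse = 1.0 - np.sum(err ** 2) / np.sum((observed - observed.mean()) ** 2)
    return {"r": r, "rmse": rmse, "mae": mae, "nse": nse}

# Illustrative sediment discharges (ton/day); not the paper's data.
obs = [1.2, 0.8, 2.5, 3.1, 0.6, 1.9]
svm = [1.1, 0.9, 2.4, 3.0, 0.7, 2.0]   # stand-in for SVM predictions
gep = [1.5, 0.5, 2.9, 2.6, 1.0, 1.4]   # stand-in for GEP predictions

print("SVM:", evaluate(obs, svm))
print("GEP:", evaluate(obs, gep))
```

A perfect simulation gives NSE = 1 and RMSE = MAE = 0, which is why the paper's near-unity NSE (0.988) marks the SVM as the preferred model.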
International Journal of Engineering and Science Invention (IJESI) is an international journal intended for professionals and researchers in all fields of computer science and electronics. IJESI publishes research articles and reviews across the whole field of Engineering, Science and Technology, covering new teaching methods, assessment, validation and the impact of new technologies, and it will continue to provide information on the latest trends and developments in this ever-expanding subject. Papers are selected through double peer review to ensure originality, relevance, and readability. The articles published in our journal can be accessed online.
Application of Source Water Quantity and Quality Model to Dongshan Peninsula - eWater
Lake Tai is the third largest freshwater lake in China, bordering Jiangsu and Zhejiang provinces and providing water to 30 million residents. A severe algal bloom in 2007 led to the development of the Lake Tai Master Plan, launched by the National Development and Reform Commission (NDRC), to improve nutrient management in the basin. Under a joint Australia-China Environmental Development Project, the Australian eWater Source Integrated Modelling System (IMS) was applied to model water quantity and quality for a pilot area on the Dongshan Peninsula in the Lake Tai Basin. Source is a powerful modelling platform for environmental management which can integrate many physical processes and human impacts, and it has been successfully applied in over 70 basins across Australia.
Runoff is one of the most significant hydrological variables in most water resources applications. Physiographically, the area is characterized by undulating topography with plains and valleys. The Soil Conservation Service Curve Number method, which draws on hydrologic soil groups, was used in this study. The method is adaptable and suitable for quick runoff estimation; it is easy to use with minimal data and gives good results. From the study, yearly rainfall and runoff were estimated readily. The study area covers 466.02 km2, with a maximum length of 36.5 km. The maximum and minimum elevations of the basin are 569 m and 341 m above MSL, respectively.
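The SCS Curve Number method the abstract relies on reduces to a single closed-form equation for direct runoff. A short Python sketch of the metric form, with an assumed composite curve number (CN = 75) purely for illustration; the basin's actual curve numbers are not given in the abstract:

```python
def scs_runoff(rainfall_mm, curve_number, ia_ratio=0.2):
    """SCS Curve Number direct runoff (mm) for a storm rainfall depth.

    S is the potential maximum retention (metric form); the conventional
    initial abstraction is Ia = 0.2 * S.
    """
    s = 25400.0 / curve_number - 254.0   # potential maximum retention, mm
    ia = ia_ratio * s                    # initial abstraction, mm
    if rainfall_mm <= ia:
        return 0.0                       # all rainfall absorbed, no runoff
    return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

# Illustrative storm depths; CN = 75 is an assumed composite value.
for p in (10.0, 50.0, 100.0):
    print(f"P = {p:5.1f} mm -> Q = {scs_runoff(p, 75):.1f} mm")
```

Low rainfall below the initial abstraction produces zero runoff, which is what makes the method quick to apply with minimal data.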
Fitting Probability Distribution Functions To Discharge Variability Of Kaduna... - IJMER
International Journal of Modern Engineering Research (IJMER) is a peer-reviewed online journal. It serves as an international archival forum of scholarly research related to engineering and science education.
A Holistic Approach for Determining the Characteristic Flow on Kangsabati Cat... - ijceronline
The Kangsabati River rises from the Chotanagpur plateau in the state of West Bengal, India, and passes through the districts of Purulia, Bankura and Paschim Medinipur before joining the Rupnarayan River. It is the lifeline of these three districts in the western part of the state. The river is ephemeral in character: flow is low through most of the year, with a high peak at certain times. Hydrological study of the Kangsabati catchment gives evidence that roughly every two years there is a chance of drought, followed by a high-flow year. In the study period from 1991 to 2010 there were six low-streamflow years, i.e. years with less rainfall than the area's average. The years 1991, 2002 and 2009 were drought-prone, and in 2010 severe drought was seen; 2010 was the lowest-rainfall year of the last 20, with only 766 mm of rainfall, about 38% below the catchment average. The highest flood peak of the last twenty years was recorded on 19 August 2007 at 377107.8 Mm3.
The study examined the characteristics of the Sumanpa stream's Flow-Duration-Frequency Curve statistics for a period of 25 years (1985-2009) and compared the 1990-1999 and 2000-2009 Flow-Duration-Curves. The high, low and mean Flow-Duration-Curves were also analysed. The discharge records were analysed to develop a general quantitative characterization of the stream's flow variability. Streamflow data were generated from daily stage data using the rating curve model developed at the stream's gauge station. Flow-Duration-Frequency-Curves were developed using the Weibull plotting position and used to analyse the catchment's surface and groundwater storage and the stream's flow characteristics. The approach placed the midpoints of the moist, mid-range, and dry zones of the curves at the 25th, 50th, and 75th percentiles, respectively; the high zone was centered at the 5th percentile and the low zone at the 95th percentile. For 95% of the time the streamflow equalled or exceeded 0.14 m3s-1, for 5% it equalled or exceeded 45 m3s-1, and for 50% it equalled or exceeded 5.53 m3s-1.
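The Weibull plotting position used to build the Flow-Duration-Curves assigns each flow, ranked in descending order, an exceedance probability p = m/(n+1). A small Python sketch, using illustrative discharges rather than the Sumanpa gauge record:

```python
import numpy as np

def flow_duration_curve(flows):
    """Exceedance probabilities by the Weibull plotting position p = m/(n+1),
    where m is the rank of each flow sorted in descending order."""
    q = np.sort(np.asarray(flows, float))[::-1]   # flows, largest first
    n = len(q)
    p = np.arange(1, n + 1) / (n + 1)             # exceedance probability
    return p, q

def flow_at_percentile(flows, percentile):
    """Flow equalled or exceeded `percentile` percent of the time (e.g. Q95)."""
    p, q = flow_duration_curve(flows)
    return float(np.interp(percentile / 100.0, p, q))

# Illustrative daily discharges (m3/s); not the Sumanpa record.
flows = [0.2, 0.5, 1.1, 2.0, 3.4, 5.5, 8.0, 12.0, 20.0, 45.0]
print("Q95 =", flow_at_percentile(flows, 95))
print("Q50 =", flow_at_percentile(flows, 50))
print("Q5  =", flow_at_percentile(flows, 5))
```

Q95 (the flow exceeded 95% of the time) characterizes the low-flow zone and Q5 the high-flow zone, mirroring the 95th- and 5th-percentile centering described in the abstract.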
A Novel Method for Prevention of Bandwidth Distributed Denial of Service Attacks - IJERD Editor
Distributed Denial of Service (DDoS) attacks have become a massive threat to the Internet. The traditional architecture of the Internet is vulnerable to attacks such as DDoS. An attacker first acquires an army of zombies; that army is then instructed when to start an attack and whom to attack. In this paper, the different techniques used to perform DDoS attacks, the tools used to perform them, and countermeasures to detect attackers and eliminate Bandwidth Distributed Denial of Service (B-DDoS) attacks are reviewed. DDoS attacks are carried out using various flooding techniques.
The main purpose of this paper is to design an architecture which can reduce Bandwidth Distributed Denial of Service attacks and keep the victim site or server available for normal users by eliminating zombie machines. The primary focus is on how normal machines turn into zombies (bots), how an attack is initiated, the DDoS attack procedure, and how an organization can save its server from becoming a DDoS victim. To demonstrate this, we implemented a simulated environment with Cisco switches, routers, a firewall, virtual machines and attack tools to reproduce a real DDoS attack. Using time scheduling, resource limiting, system logs, access control lists and the Modular Policy Framework, we stopped the attack and identified the attacker (bot) machines.
Hearing loss is one of the most common human impairments. It is estimated that by the year 2015 more than 700 million people will suffer mild deafness. Most can be helped by hearing aid devices, depending on the severity of their hearing loss. This paper describes the implementation and characterization of a dual-channel transmitter front end (TFE) for digital hearing aid (DHA) applications that uses novel micro-electromechanical-systems (MEMS) audio transducers and ultra-low-power, power-scalable analog-to-digital converters (ADCs), enabling a very low form factor, energy-efficient implementation for next-generation DHAs. The contribution of the design is the implementation of the dual-channel MEMS microphones and the power-scalable ADC system.
Influence of tensile behaviour of slab on the structural behaviour of shear c... - IJERD Editor
A composite beam is composed of a steel beam and a slab connected by means of shear connectors, such as studs installed on the top flange of the steel beam, so that the structure behaves monolithically. This study analyzes the effects of the tensile behavior of the slab on the structural behavior of the shear connection, such as slip stiffness and maximum shear force, in composite beams subjected to hogging moment. The results show that shear studs located in crack-concentration zones caused by large hogging moments sustain significantly smaller shear force and slip stiffness than those in other zones. Moreover, the reduction of slip stiffness in the shear connection also appears closely related to the change in the tensile strain of the rebar as the load increases. Further experimental and analytical studies shall be conducted considering variables such as the reinforcement ratio and the arrangement of shear connectors, to achieve efficient design of the shear connection in composite beams subjected to hogging moment.
Gold prospecting using Remote Sensing 'A case study of Sudan' - IJERD Editor
Gold has been extracted from northeast Africa for more than 5000 years, and this may be the first place where the metal was extracted. The Arabian-Nubian Shield (ANS) is an exposure of Precambrian crystalline rocks on the flanks of the Red Sea; the crystalline rocks are mostly Neoproterozoic in age. The ANS includes the nations of Israel, Jordan, Egypt, Saudi Arabia, Sudan, Eritrea, Ethiopia, Yemen, and Somalia. The Arabian-Nubian Shield consists of juvenile continental crust that formed between 900 and 550 Ma, when intra-oceanic arcs welded together along ophiolite-decorated sutures. Primary Au mineralization probably developed in association with the growth of the intra-oceanic arcs and the evolution of back-arcs. Multiple episodes of deformation have obscured the primary metallogenic setting, but at least some of the deposits preserve evidence that they originated as sea-floor massive sulphide deposits.
The Red Sea Hills region is a vast span of rugged, harsh and inhospitable terrain with an inimical moon-like landscape; nevertheless, since ancient times it has been famed as an abode of gold and was a major source of wealth for the Pharaohs of ancient Egypt. The Pharaohs' old workings have been periodically rediscovered through time. Recent endeavours by the Geological Research Authority of Sudan led to the discovery of a score of occurrences of gold and massive sulphide mineralization. In the 1990s the Geological Research Authority of Sudan (GRAS), in cooperation with BRGM, utilized Landsat TM satellite data and the spectral ratio technique to map possible mineralized zones in the Red Sea Hills of Sudan. The study mapped a gossan-type gold mineralization. The band ratio technique was applied to the Arbaat area and the signature of an alteration zone was detected; alteration zones are commonly associated with mineralization. A field check confirmed the existence of a stockwork of gold-bearing quartz in the alteration zone. Another type of gold mineralization discovered using remote sensing is the gold associated with metachert in the Atmur Desert.
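The band ratio technique described above is a simple pixel-wise division of one spectral band by another, chosen so that altered materials stand out. A minimal Python sketch; the TM 5/7 ratio and the 1.6 threshold are common illustrative choices for hydroxyl-bearing alteration, not the GRAS/BRGM processing parameters:

```python
import numpy as np

def band_ratio(numerator, denominator, eps=1e-6):
    """Pixel-wise band ratio; eps guards against division by zero."""
    num = np.asarray(numerator, float)
    den = np.asarray(denominator, float)
    return num / np.maximum(den, eps)

# Toy 3x3 reflectance grids standing in for Landsat TM bands 5 and 7;
# clay and hydroxyl alteration minerals absorb in band 7, raising the 5/7 ratio.
tm5 = np.array([[0.30, 0.32, 0.28],
                [0.35, 0.40, 0.33],
                [0.29, 0.31, 0.30]])
tm7 = np.array([[0.25, 0.26, 0.24],
                [0.18, 0.16, 0.20],
                [0.24, 0.25, 0.23]])

ratio = band_ratio(tm5, tm7)
altered = ratio > 1.6        # assumed threshold for flagging alteration
print(ratio.round(2))
print("flagged pixels:", int(altered.sum()))
```

In real use the thresholded mask would be draped over geology maps to pick field-check targets, as was done at Arbaat.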
Reducing Corrosion Rate by Welding Design - IJERD Editor
The paper addresses the importance of welding design in preventing corrosion in steel. Welding is used to join pipes, bridge profiles, spindles, and many other parts of engineering construction. Problems associated with welding, especially corrosion, are common issues in these fields. Corrosion can be reduced by many methods, including painting, humidity control, and good welding design. The research found that reducing residual stress in the weld helps reduce the corrosion rate.
Preheating at 500°C and 600°C gives a better condition for reducing the corrosion rate than preheating at 400°C. For all welding groove types, material preheated at 500°C or 600°C lost 0.5%-0.69% after a 14-day corrosion test, while material preheated at 400°C lost 0.57%-0.76%.
The welding groove also influences the corrosion rate. X-type and V-type welding grooves give a better condition for reducing the corrosion rate than 1/2V and 1/2X grooves. After the 14-day corrosion test, samples with an X-type groove lost 0.5%-0.57%, samples with a V-type groove lost 0.51%-0.59%, and samples with 1/2V and 1/2X grooves lost 0.58%-0.71%.
Router 1X3 – RTL Design and Verification - IJERD Editor
Routing is the process of moving a packet of data from source to destination; it enables messages to pass from one computer to another and eventually reach the target machine. A router is a networking device that forwards data packets between computer networks. It is connected to two or more data lines from different networks (as opposed to a network switch, which connects data lines from one single network). This paper mainly emphasizes the study of the router device, its top-level architecture, and how the various sub-modules of the router, i.e. the register, FIFO, FSM and synchronizer, are synthesized, simulated and finally connected to the top module.
Active Power Exchange in Distributed Power-Flow Controller (DPFC) At Third Ha... - IJERD Editor
This paper presents a component within the flexible AC transmission system (FACTS) family called the distributed power-flow controller (DPFC). The DPFC is derived from the unified power-flow controller (UPFC) with the common dc link eliminated. The DPFC has the same control capabilities as the UPFC, comprising adjustment of the line impedance, the transmission angle, and the bus voltage. The active power exchange between the shunt and series converters, which occurs through the common dc link in the UPFC, now takes place through the transmission lines at the third-harmonic frequency. The DPFC employs multiple small-size single-phase converters, which reduces equipment cost, requires no voltage isolation between phases, and increases redundancy and thereby reliability. The principle and analysis of the DPFC are presented in this paper, and the corresponding simulation results, carried out on a scaled prototype, are also shown.
Mitigation of Voltage Sag/Swell with Fuzzy Control Reduced Rating DVR - IJERD Editor
Power quality has become an increasingly pivotal issue from the point of view of industrial electricity consumers in recent times. Modern industries employ sensitive power electronic equipment, control devices and non-linear loads as part of automated processes to increase energy efficiency and productivity. Voltage disturbances are the most common power quality problem, and their impact has grown with the increasing use of sophisticated and sensitive electronic equipment in industrial systems. This paper discusses the design and simulation of a dynamic voltage restorer (DVR) for improving power quality and reducing the harmonic distortion of sensitive loads. Power quality problems occur at non-standard voltage, current and frequency, and electronic devices are very sensitive loads. In a power system, voltage sag, swell, flicker and harmonics are some of the problems affecting sensitive loads. The compensation capability of a DVR depends primarily on its maximum voltage injection ability and the amount of stored energy available within the restorer. The device is connected in series with the distribution feeder at medium voltage. A fuzzy logic controller is used to produce the gate pulses for the DVR control circuit, and the circuit is simulated using MATLAB/SIMULINK software.
Study on the Fused Deposition Modelling In Additive ManufacturingIJERD Editor
Additive manufacturing process, also popularly known as 3-D printing, is a process where a product
is created in a succession of layers. It is based on a novel materials incremental manufacturing philosophy.
Unlike conventional manufacturing processes where material is removed from a given work price to derive the
final shape of a product, 3-D printing develops the product from scratch thus obviating the necessity to cut away
materials. This prevents wastage of raw materials. Commonly used raw materials for the process are ABS
plastic, PLA and nylon. Recently the use of gold, bronze and wood has also been implemented. The complexity
factor of this process is 0% as in any object of any shape and size can be manufactured.
Spyware triggering system by particular string valueIJERD Editor
This computer programme can be used for good and bad purpose in hacking or in any general
purpose. We can say it is next step for hacking techniques such as keylogger and spyware. Once in this system if
user or hacker store particular string as a input after that software continually compare typing activity of user
with that stored string and if it is match then launch spyware programme.
A Blind Steganalysis on JPEG Gray Level Image Based on Statistical Features a...IJERD Editor
This paper presents a blind steganalysis technique to effectively attack the JPEG steganographic
schemes i.e. Jsteg, F5, Outguess and DWT Based. The proposed method exploits the correlations between
block-DCTcoefficients from intra-block and inter-block relation and the statistical moments of characteristic
functions of the test image is selected as features. The features are extracted from the BDCT JPEG 2-array.
Support Vector Machine with cross-validation is implemented for the classification.The proposed scheme gives
improved outcome in attacking.
Secure Image Transmission for Cloud Storage System Using Hybrid SchemeIJERD Editor
- Data over the cloud is transferred or transmitted between servers and users. Privacy of that
data is very important as it belongs to personal information. If data get hacked by the hacker, can be
used to defame a person’s social data. Sometimes delay are held during data transmission. i.e. Mobile
communication, bandwidth is low. Hence compression algorithms are proposed for fast and efficient
transmission, encryption is used for security purposes and blurring is used by providing additional
layers of security. These algorithms are hybridized for having a robust and efficient security and
transmission over cloud storage system.
Application of Buckley-Leverett Equation in Modeling the Radius of Invasion i...IJERD Editor
A thorough review of existing literature indicates that the Buckley-Leverett equation only analyzes
waterflood practices directly without any adjustments on real reservoir scenarios. By doing so, quite a number
of errors are introduced into these analyses. Also, for most waterflood scenarios, a radial investigation is more
appropriate than a simplified linear system. This study investigates the adoption of the Buckley-Leverett
equation to estimate the radius invasion of the displacing fluid during waterflooding. The model is also adopted
for a Microbial flood and a comparative analysis is conducted for both waterflooding and microbial flooding.
Results shown from the analysis doesn’t only records a success in determining the radial distance of the leading
edge of water during the flooding process, but also gives a clearer understanding of the applicability of
microbes to enhance oil production through in-situ production of bio-products like bio surfactans, biogenic
gases, bio acids etc.
Gesture Gaming on the World Wide Web Using an Ordinary Web CameraIJERD Editor
- Gesture gaming is a method by which users having a laptop/pc/x-box play games using natural or
bodily gestures. This paper presents a way of playing free flash games on the internet using an ordinary webcam
with the help of open source technologies. Emphasis in human activity recognition is given on the pose
estimation and the consistency in the pose of the player. These are estimated with the help of an ordinary web
camera having different resolutions from VGA to 20mps. Our work involved giving a 10 second documentary to
the user on how to play a particular game using gestures and what are the various kinds of gestures that can be
performed in front of the system. The initial inputs of the RGB values for the gesture component is obtained by
instructing the user to place his component in a red box in about 10 seconds after the short documentary before
the game is finished. Later the system opens the concerned game on the internet on popular flash game sites like
miniclip, games arcade, GameStop etc and loads the game clicking at various places and brings the state to a
place where the user is to perform only gestures to start playing the game. At any point of time the user can call
off the game by hitting the esc key and the program will release all of the controls and return to the desktop. It
was noted that the results obtained using an ordinary webcam matched that of the Kinect and the users could
relive the gaming experience of the free flash games on the net. Therefore effective in game advertising could
also be achieved thus resulting in a disruptive growth to the advertising firms.
Hardware Analysis of Resonant Frequency Converter Using Isolated Circuits And...IJERD Editor
-LLC resonant frequency converter is basically a combo of series as well as parallel resonant ckt. For
LCC resonant converter it is associated with a disadvantage that, though it has two resonant frequencies, the
lower resonant frequency is in ZCS region[5]. For this application, we are not able to design the converter
working at this resonant frequency. LLC resonant converter existed for a very long time but because of
unknown characteristic of this converter it was used as a series resonant converter with basically a passive
(resistive) load. . Here, it was designed to operate in switching frequency higher than resonant frequency of the
series resonant tank of Lr and Cr converter acts very similar to Series Resonant Converter. The benefit of LLC
resonant converter is narrow switching frequency range with light load[6] . Basically, the control ckt plays a
very imp. role and hence 555 Timer used here provides a perfect square wave as the control ckt provides no
slew rate which makes the square wave really strong and impenetrable. The dead band circuit provides the
exclusive dead band in micro seconds so as to avoid the simultaneous firing of two pairs of IGBT’s where one
pair switches off and the other on for a slightest period of time. Hence, the isolator ckt here is associated with
each and every ckt used because it acts as a driver and an isolation to each of the IGBT is provided with one
exclusive transformer supply[3]. The IGBT’s are fired using the appropriate signal using the previous boards
and hence at last a high frequency rectifier ckt with a filtering capacitor is used to get an exact dc
waveform .The basic goal of this particular analysis is to observe the wave forms and characteristics of
converters with differently positioned passive elements in the form of tank circuits.
Simulated Analysis of Resonant Frequency Converter Using Different Tank Circu...IJERD Editor
LLC resonant frequency converter is basically a combo of series as well as parallel resonant ckt. For
LCC resonant converter it is associated with a disadvantage that, though it has two resonant frequencies, the
lower resonant frequency is in ZCS region [5]. For this application, we are not able to design the converter
working at this resonant frequency. LLC resonant converter existed for a very long time but because of
unknown characteristic of this converter it was used as a series resonant converter with basically a passive
(resistive) load. . Here, it was designed to operate in switching frequency higher than resonant frequency of the
series resonant tank of Lr and Cr converter acts very similar to Series Resonant Converter. The benefit of LLC
resonant converter is narrow switching frequency range with light load[6] . Basically, the control ckt plays a
very imp. role and hence 555 Timer used here provides a perfect square wave as the control ckt provides no
slew rate which makes the square wave really strong and impenetrable. The dead band circuit provides the
exclusive dead band in micro seconds so as to avoid the simultaneous firing of two pairs of IGBT’s where one
pair switches off and the other on for a slightest period of time. Hence, the isolator ckt here is associated with
each and every ckt used because it acts as a driver and an isolation to each of the IGBT is provided with one
exclusive transformer supply[3]. The IGBT’s are fired using the appropriate signal using the previous boards
and hence at last a high frequency rectifier ckt with a filtering capacitor is used to get an exact dc
waveform .The basic goal of this particular analysis is to observe the wave forms and characteristics of
converters with differently positioned passive elements in the form of tank circuits. The supported simulation
is done through PSIM 6.0 software tool
Amateurs Radio operator, also known as HAM communicates with other HAMs through Radio
waves. Wireless communication in which Moon is used as natural satellite is called Moon-bounce or EME
(Earth -Moon-Earth) technique. Long distance communication (DXing) using Very High Frequency (VHF)
operated amateur HAM radio was difficult. Even with the modest setup having good transceiver, power
amplifier and high gain antenna with high directivity, VHF DXing is possible. Generally 2X11 YAGI antenna
along with rotor to set horizontal and vertical angle is used. Moon tracking software gives exact location,
visibility of Moon at both the stations and other vital data to acquire real time position of moon.
“MS-Extractor: An Innovative Approach to Extract Microsatellites on „Y‟ Chrom...IJERD Editor
Simple Sequence Repeats (SSR), also known as Microsatellites, have been extensively used as
molecular markers due to their abundance and high degree of polymorphism. The nucleotide sequences of
polymorphic forms of the same gene should be 99.9% identical. So, Microsatellites extraction from the Gene is
crucial. However, Microsatellites repeat count is compared, if they differ largely, he has some disorder. The Y
chromosome likely contains 50 to 60 genes that provide instructions for making proteins. Because only males
have the Y chromosome, the genes on this chromosome tend to be involved in male sex determination and
development. Several Microsatellite Extractors exist and they fail to extract microsatellites on large data sets of
giga bytes and tera bytes in size. The proposed tool “MS-Extractor: An Innovative Approach to extract
Microsatellites on „Y‟ Chromosome” can extract both Perfect as well as Imperfect Microsatellites from large
data sets of human genome „Y‟. The proposed system uses string matching with sliding window approach to
locate Microsatellites and extracts them.
Importance of Measurements in Smart GridIJERD Editor
- The need to get reliable supply, independence from fossil fuels, and capability to provide clean
energy at a fixed and lower cost, the existing power grid structure is transforming into Smart Grid. The
development of a smart energy distribution grid is a current goal of many nations. A Smart Grid should have
new capabilities such as self-healing, high reliability, energy management, and real-time pricing. This new era
of smart future grid will lead to major changes in existing technologies at generation, transmission and
distribution levels. The incorporation of renewable energy resources and distribution generators in the existing
grid will increase the complexity, optimization problems and instability of the system. This will lead to a
paradigm shift in the instrumentation and control requirements for Smart Grids for high quality, stable and
reliable electricity supply of power. The monitoring of the grid system state and stability relies on the
availability of reliable measurement of data. In this paper the measurement areas that highlight new
measurement challenges, development of the Smart Meters and the critical parameters of electric energy to be
monitored for improving the reliability of power systems has been discussed.
Study of Macro level Properties of SCC using GGBS and Lime stone powderIJERD Editor
One of the major environmental concerns is the disposal of the waste materials and utilization of
industrial by products. Lime stone quarries will produce millions of tons waste dust powder every year. Having
considerable high degree of fineness in comparision to cement this material may be utilized as a partial
replacement to cement. For this purpose an experiment is conducted to investigate the possibility of using lime
stone powder in the production of SCC with combined use GGBS and how it affects the fresh and mechanical
properties of SCC. First SCC is made by replacing cement with GGBS in percentages like 10, 20, 30, 40, 50 and
by taking the optimum mix with GGBS lime stone powder is blended to mix in percentages like 5, 10, 15, 20 as
a partial replacement to cement. Test results shows that the SCC mix with combination of 30% GGBS and 15%
limestone powder gives maximum compressive strength and fresh properties are also in the limits prescribed by
the EFNARC.
State of ICS and IoT Cyber Threat Landscape Report 2024 previewPrayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio, cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors, and newer malware including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
GridMate - End to end testing is a critical piece to ensure quality and avoid...ThomasParaiso2
End to end testing is a critical piece to ensure quality and avoid regressions. In this session, we share our journey building an E2E testing pipeline for GridMate components (LWC and Aura) using Cypress, JSForce, FakerJS…
UiPath Test Automation using UiPath Test Suite series, part 5DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 5. In this session, we will cover CI/CD with devops.
Topics covered:
CI/CD with in UiPath
End-to-end overview of CI/CD pipeline with Azure devops
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
Pushing the limits of ePRTC: 100ns holdover for 100 daysAdtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if sometime changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
PHP Frameworks: I want to break free (IPC Berlin 2024)Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
Communications Mining Series - Zero to Hero - Session 1DianaGray10
This session provides introduction to UiPath Communication Mining, importance and platform overview. You will acquire a good understand of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Best Fit and Selection of Probability Distribution Models for Frequency Analysis of Extreme Mean Annual Rainfall Events

International Journal of Engineering Research and Development
e-ISSN: 2278-067X, p-ISSN: 2278-800X, www.ijerd.com
Volume 11, Issue 04 (April 2015), PP. 34-53
EM Masereka¹, FAO Otieno², GM Ochieng³, J Snyman⁴

¹ D.Tech. student, Civil Engineering Department, Tshwane University of Technology, Pretoria, South Africa
² Prof. and Vice Chancellor, Durban University of Technology, Durban, South Africa
³ Prof. and Head of Department, Civil Engineering and Building, Vaal University of Technology, Vanderbijlpark, South Africa
⁴ Dr. J. Snyman, Senior Lecturer, Department of Civil Engineering and Building, Tshwane University of Technology, Pretoria, South Africa
Abstract:- Frequency analysis of extreme low mean annual rainfall events is important to water resource planners at catchment level because mean annual rainfall is an important parameter in determining mean annual runoff, which in turn is an important input in determining the surface water available for water resource infrastructure development. To carry out frequency analysis of extreme low mean annual rainfall events, it is necessary to identify the best-fit probability distribution models (PDMs) for the analysis. The primary objective of this study was to develop two model identification criteria: the first to identify candidate probability distribution models, and the second to select the best-fit models from among those candidates. The secondary objectives were to apply the developed criteria to identify the candidate and best-fit probability distribution models, and to carry out frequency analysis of extreme low mean annual rainfall events in the Sabie river catchment, one of the water-deficit catchments in South Africa.
Although not directly correlated, mean annual rainfall determines mean annual runoff at catchment level. Frequency analysis of mean annual rainfall events is therefore an important part of estimating mean annual runoff events, and from the estimated annual runoff figures the water resource available at catchment level can be estimated. This makes mean annual rainfall modeling important for water resource planning and management at catchment level.
The two model identification criteria developed are the Candidate Model Identification Criterion (CMIC) and the Least Sum of Statistics Model Identification Criterion (LSSMIC). CMIC and LSSMIC were applied to identify candidate models and best-fit models for frequency analysis of the distribution of extreme low mean annual rainfall events in the 8 rainfall zones of the Sabie river catchment. Mean annual rainfall data for the period 1920-2004, obtained from the Water Research Commission of South Africa, were used in this study. The points below threshold method (PBTM) was applied to obtain samples of extreme low mean annual rainfall events from each of the 8 rainfall zones, with the long-term mean of the 85-year record in each zone chosen as the threshold.
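As a minimal illustration of this sampling step, the points below threshold method reduces to selecting the annual values that fall below the long-term mean. The sketch below uses synthetic data standing in for a zonal record, since the Water Research Commission series is not reproduced here:

```python
# Points below threshold method (PBTM): illustrative sketch.
# Synthetic 85-year record; the study's real data cover 1920-2004.
import numpy as np

rng = np.random.default_rng(42)
annual_rainfall = rng.gamma(shape=20.0, scale=30.0, size=85)  # mm/a, synthetic

threshold = annual_rainfall.mean()  # long-term mean chosen as the threshold
extreme_low_events = annual_rainfall[annual_rainfall < threshold]

print(f"threshold = {threshold:.1f} mm")
print(f"{extreme_low_events.size} of {annual_rainfall.size} years below threshold")
```

The sample of below-threshold values is what the candidate distributions are then fitted to, one sample per rainfall zone.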
Identification of the best-fit models for frequency analysis of extreme low mean annual rainfall events in each of the 8 rainfall zones was carried out in 2 stages. In stage 1, CMIC was applied to identify candidate models; in stage 2, LSSMIC was applied to identify the best-fit models from among the candidates. The performance of CMIC and LSSMIC was assessed by application of Probability-Probability (P-P) plots. Although P-P plot results cannot be considered completely conclusive, the CMIC and LSSMIC criteria are useful model selection tools for frequency analysis of extreme mean annual rainfall events.
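A P-P plot compares empirical non-exceedance probabilities with the probabilities given by a fitted model; points near the 1:1 line indicate a good fit. A sketch of the underlying computation, using synthetic data and a normal distribution purely as a stand-in candidate model, might look like:

```python
# P-P plot coordinates: empirical plotting positions vs. fitted-model CDF.
# The normal distribution here is only a stand-in candidate model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = np.sort(rng.normal(loc=465.0, scale=80.0, size=85))  # synthetic mm/a

# Weibull plotting positions i/(n+1) give the empirical probabilities
n = sample.size
empirical_p = np.arange(1, n + 1) / (n + 1)

# Theoretical probabilities from the fitted candidate model
loc, scale = stats.norm.fit(sample)
theoretical_p = stats.norm.cdf(sample, loc=loc, scale=scale)

# A single goodness-of-fit summary: sum of squared departures
# of the P-P points from the 1:1 line.
sum_sq = np.sum((empirical_p - theoretical_p) ** 2)
print(f"sum of squared differences = {sum_sq:.4f}")
```

Plotting `theoretical_p` against `empirical_p` gives the P-P plot itself; the scalar summary is one simple way to compare how closely different candidates track the 1:1 line.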
The results from the application of CMIC and LSSMIC showed that the best-fit models for frequency analysis of extreme low mean annual rainfall events in the Sabie river catchment are Log Pearson 3, Generalised Logistic and Generalised Extreme Value.
Keywords:- Best-fit probability distribution function, Candidate probability distribution functions, Candidate Model Identification Criterion (CMIC), Least Sum of Statistics Model Identification Criterion (LSSMIC)
I. INTRODUCTION
A. Water resources situation in South Africa
South Africa comprises 19 water management areas (WMAs), 11 of which are water-deficit catchments where water resource demand is greater than the available water resource. Estimates carried out by the Department of Water Affairs and Forestry indicate that by 2025 two or more additional Water Management Areas will experience a water deficit (IWMI, 1998). Although only 11 of the 19 WMAs are in water resource deficit, on average the whole country can be classified as water stressed (IWMI, 1998): annual fresh water availability is estimated at less than 1700 m³ per capita, and 1700 m³ per capita is taken as the threshold, or index, for water stress. Prolonged periods of low rainfall result in low agricultural production.
South Africa depends on surface water resources for most of its domestic, industrial and irrigation requirements (NWRS, 2003). There are no big rivers in South Africa, so the surface water resources are mainly in the form of runoff. The estimated mean annual runoff under natural conditions is 49 000 Mm³/a and the mean annual rainfall is 465 mm (NWRS, 2003). The utilizable groundwater exploitation potential is estimated at 7500 Mm³/a (NWRS, 2003). Although not directly correlated, the amount of mean annual runoff depends on the amount of mean annual rainfall, and both are important variables for water resource planning at catchment level. In planning water resource systems at catchment level, it is necessary to consider the impacts of extreme scenarios of both mean annual rainfall and mean annual runoff.
B. Modeling the distribution of extreme mean annual rainfall events
Modeling of the distribution of extreme mean annual rainfall events is based on the assumption that
these events are independent and identically distributed random events. This process is called stochastic
modeling: developing mathematical models that are applied to extrapolate and generate events based on sample
data of those specific events. Numerous stochastic models have been developed to extrapolate different aspects
of hydro-meteorological events, including mean annual runoff, mean annual rainfall, temperature, stream flow,
groundwater, soil moisture and wind. Shamir et al. (2007) developed stochastic techniques to generate input
data for modeling small to medium catchments. Furrer and Katz (2008) studied stochastic generation of extreme
rainfall events. The general practice in stochastic analysis of hydro-meteorological events, including extreme
mean annual rainfall, has been to assume a probability distribution function that is then applied in the analysis.
The focus of this study was to develop model identification criteria for selecting the best fit probability
distribution functions for frequency analysis of extreme annual rainfall events, and to apply the developed
criteria to identify the best fit models. The identified best fit models were applied in modeling the distribution of
extreme mean annual rainfall events in the Sabie river catchment, which is one of the water-deficit catchments
in South Africa.
C. Methods of identifying best fit probability distribution functions for extreme mean annual rainfall events.
In developing magnitude-return period models for frequency analysis of extreme mean annual rainfall
events, as for other extreme hydro-meteorological events, it is necessary to identify probability distribution
functions that best fit the extreme event data. The methods that have been applied for identifying best fit
probability distribution functions include the maximum likelihood method (Merz and Bloschl, 2005; Willems et
al., 2007), L-moments based methods (Hosking et al., 1985; Hosking, 1990), the Akaike information criterion
(Akaike, 1973) and the Bayesian information criterion (Schwarz, 1978). The reliability of identifying best fit
models for hydro-meteorological frequency analysis has attracted some criticism. Schulze (1989) points out that
because sample data are generally short, non-homogeneous and non-stationary, extrapolation beyond the record
length of the sample data may give unreliable results.
Beven (2000) has outlined the following limitations of applying probability distribution functions in modeling
extremes of hydro-meteorological events such as floods:
- The best fit probability distribution function of the parent events is unknown, and different probability
distribution functions may give acceptable fits to the available data; yet extrapolation based on these
probability distribution functions may result in significantly different estimates of the design events.
- The fitted probability distribution function does not explicitly take into account any changes in the
runoff generation processes for higher magnitude events.
Apart from the above criticisms, a gap remains in that no specific criterion for identifying best-fit PDFs for
extreme hydro-meteorological events that is universally accepted in South Africa has been developed (Smithers
and Schulze, 2003). This gap is addressed in this study by developing two model identification criteria based on
data from the Sabie river catchment, a water-deficit catchment.
D. Probability Distribution functions for frequency analysis of extreme hydro- meteorological events in South
Africa.
The Log-Pearson 3 (LP3) probability distribution function has been recommended for design hydro-
meteorological events, mostly floods and droughts, in South Africa (Alexander, 1990, 2001). Gorgens (2007)
used both the LP3 and the Generalised Extreme Value distributions and found the two models suitable for
frequency analysis of extreme hydro-meteorological events in South Africa. However, Mkhandi et al. (2000)
found the Pearson Type 3 probability distribution function, fitted by the method of probability weighted
moments (PWM), to be the most appropriate distribution to use in 12 of the 15 relatively homogeneous regions
identified in South Africa (Smithers, 2002). Cullis et al. (2007) and Gericke (2010) have specifically
recommended further research into methodologies for determining best fit PDFs for frequency analysis of
extreme hydro-meteorological events in South Africa (Smithers, 2002). In this study two model identification
criteria were developed to address the problem of identifying best fit probability distribution functions for
frequency analysis of extreme hydro-meteorological events, specifically extreme mean annual rainfall events,
in water-deficit catchments in South Africa.
E. Uncertainty associated with modeling of extreme mean annual rainfall events.
There is inherent uncertainty in modeling extreme mean annual rainfall, as with other extreme hydro-
meteorological events. Yen et al. (1986) identified 5 classes of uncertainty:
- Natural uncertainty, due to the inherent randomness of natural processes;
- Model uncertainty, due to the inability of the identified model to represent accurately the system's true
physical behavior;
- Parameter uncertainty, resulting from the inability to quantify accurately the model inputs and parameters;
- Data uncertainty, including measurement errors and instrument malfunctioning; and
- Operational uncertainty, which includes human factors that are not accounted for in the modeling or design
procedure.
Yue-Ping (2010) cites Van Asselt (2000) as classifying uncertainty based on the modeler's and decision
makers' views; in this case the two classes are model outcome uncertainty and decision uncertainty. The aim of
statistical modeling of extreme hydro-meteorological events is to reduce the degree of uncertainty and the
associated risks to acceptable levels.
F. Limitations inherent in presently applied model selection criteria
The common practice in frequency analysis of extreme hydro-meteorological events such as extreme mean
annual rainfall has been for the modeler to choose a model for frequency analysis, or to choose a set of models
from which the best-fit model to be applied is identified (Laio et al., 2009). The following limitations have been
identified in this approach:
- Subjectivity: there is no consistent, universally accepted method of choosing a model, or a set of candidate
models from which the best fit can be identified, for frequency analysis. The choice depends on the
experience of the modeler and is therefore subjective;
- Ambiguity: two or more models may pass the goodness-of-fit test, and which one to choose for analysis
then becomes ambiguous (Burnham and Anderson, 2002); and
- Parsimony: the identified model may mimic the sample data applied in the frequency analysis rather than
the underlying trend of the variable under consideration.
Jiang (2014) has further outlined limitations of presently applied model selection criteria:
- Effective sample size: the effective size of a sample of n points may be smaller than n because of
correlation among the data points;
- The dimension of the model: the number of parameters can affect the model fitting process;
- The finite-sample performance and effectiveness of the penalty: the penalty chosen may be subjective; and
- The criterion of optimality: in the present criteria, practical considerations, for instance economic or social
factors, are hardly included.
The limitation of parameter under- and over-fitting has also been cited in the current model selection criteria.
An attempt was made to address the above limitations in developing the two model selection criteria.
II. METHODS
Developing two model identification criteria and applying the developed criteria for frequency analysis
of extreme low mean annual rainfall events in the 8 rainfall zones of the Sabie secondary catchment was carried
out in 4 stages:
1. Developing CMIC for identifying candidate models.
2. Developing LSSMIC for identifying the best fit models from the candidate models.
3. Developing frequency analysis models based on the identified best fit models.
4. Developing QT-T models for each extreme low mean rainfall event sample.
A. Development process of CMIC
The development of CMIC was made in two steps. Step 1 was based on the classification of PDFs and
the bound characteristics of the upper and lower tail events of the distributions of the sample data of extreme
low mean annual rainfall. Step 2 was based on set significance levels in hypothesis testing, as explained below.
1) Bounds classification and tail event characteristics of PDFs: Step 1 of the development of the CMIC
criterion was based on the classification of continuous probability distribution functions and the bound
characteristics of the upper and lower tail events of the distributions of the sample data. Continuous probability
distribution functions can be divided into four classes: Bounded, Unbounded, Non-Negative and Advanced
(MathWave, 2011). This division is based on their upper and lower tail event characteristics and their
functionality. The lower and upper tail events of the samples were applied to develop CMIC because the
extreme characteristics of a sample are expressed in the spread of its tail events. The probability distribution
function of the events in any sample of any variable has two bounds: the upper and lower tail events. MathWave
(2011) has proposed three possible characteristics of any tail event bound: unknown, open and closed. These
characteristics were adopted.
The rationale behind the three characteristics can be illustrated as follows. Let sample $M$ of variable $X$
be made of events $X_1, X_2, \ldots, X_{n-1}, X_n$ arranged in ascending order, with sample size $n$. If the
sample events $X_1$ and $X_n$ are defined and known, then the sample comes from a parent population of
frequency distribution functions with both lower and upper tails bounded, and the tails are closed. These
distributions are called bounded with closed tails (MathWave, 2011). If the $X_1$ and $X_n$ events of $M$ are
undefined, with unknown values, then the sample belongs to a parent population of frequency distributions
which are unbounded, with unknown or open tails. If $X_1$ and $X_n$ of the sample are positive, then the
sample belongs to a population of frequency distributions whose end tails can be non-negative, unknown, open
or closed. Sample data that do not belong to any of the above groups belong to populations with advanced
distribution functions (MathWave, 2011). A summary of the distribution bounds is given in Table I.
Table I: Bound Classifications and Tail Characteristics of Distribution Functions

Lower tail \ Upper tail | Unknown                                    | Open                              | Closed
Unknown                 | Bounded, Unbounded, Non-Negative, Advanced | Unbounded, Non-Negative, Advanced | Bounded, Advanced
Open                    | Unbounded, Advanced                        | Unbounded, Advanced               | Advanced
Closed                  | Bounded, Non-Negative, Advanced            | Non-Negative, Advanced            | Bounded, Advanced

Source: (MathWave, 2011)
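The mapping in Table I can be sketched as a small lookup (a hypothetical helper, not the study's software; the class names follow MathWave, 2011):

```python
# Sketch of CMIC step 1: map the bound characteristics of a sample's lower and
# upper tails to the candidate distribution classes of Table I (MathWave, 2011).
TABLE_I = {
    ("unknown", "unknown"): {"Bounded", "Unbounded", "Non-Negative", "Advanced"},
    ("unknown", "open"):    {"Unbounded", "Non-Negative", "Advanced"},
    ("unknown", "closed"):  {"Bounded", "Advanced"},
    ("open", "unknown"):    {"Unbounded", "Advanced"},
    ("open", "open"):       {"Unbounded", "Advanced"},
    ("open", "closed"):     {"Advanced"},
    ("closed", "unknown"):  {"Bounded", "Non-Negative", "Advanced"},
    ("closed", "open"):     {"Non-Negative", "Advanced"},
    ("closed", "closed"):   {"Bounded", "Advanced"},
}

def candidate_classes(lower_tail: str, upper_tail: str) -> set:
    """Candidate distribution classes for the given (lower, upper) tail characteristics."""
    return TABLE_I[(lower_tail.lower(), upper_tail.lower())]
```

For example, a sample whose smallest and largest events are both defined and known has closed lower and upper tails, so only the bounded and advanced classes remain as candidates.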
The CMIC criterion for identifying sample-specific candidate models for frequency analysis was based on
Table I. The development of CMIC involved the following basics. Let the sample $M$ of variable $X$ be made
of events:

$M = \{X_1, X_2, \ldots, X_{n-1}, X_n\}$   (2.1)

If the events are arranged in ascending order, then

$X_1 \le X_2 \le \cdots \le X_{n-1} \le X_n$   (2.2)

If $X_1$ is the smallest numerical-value event and forms the last event of the lower tail of the frequency
distribution of $M$, and $X_n$ is the largest numerical-value event and forms the last event of the upper tail of
the frequency distribution of $M$, then both the upper and lower tails of $M$ are bounded, defined and closed;
therefore the candidate models for frequency analysis of $M$ are the bounded and advanced distributions with
closed tails (Table I). There are two extreme scenarios in this method. Scenario one arises when the numerical
values $X_1$ and $X_n$ of a specific sample are unknown and undefined. In this case all available continuous
probability distribution functions are candidate models: bounded, unbounded, non-negative and advanced
(Table I). The other scenario is when the numerical values of $X_1$ and $X_n$ are identified and defined. In
this case the bounded and advanced continuous probability distribution functions are the candidate models. The
step 1 procedure led to identification of the initial candidate models for frequency analysis of the low and high
extreme mean rainfall sample data events.
2) Hypothesis testing and significance levels: This was step 2 of the development of CMIC. For the initial
candidate models identified in step 1 for each data sample, hypothesis testing at significance levels 0.2, 0.1,
0.05, 0.02 and 0.01 was carried out.
The goodness-of-fit tests adopted for this study were Kolmogorov-Smirnov, Anderson-Darling and
Chi-Squared. The reasons for applying these 3 specific goodness-of-fit tests are outlined in section C.
The null and alternative hypotheses for each test were:
$H_0$: The sample data are best described by the specific probability distribution function.
$H_A$: The sample data are not best described by the specific probability distribution function.
The hypothesis testing was applied to fence off the final candidate models from the initial candidate
models identified by the bounds method. Models from the initial candidate models that were rejected in
hypothesis testing at any of the significance levels 0.2, 0.1, 0.05, 0.02 and 0.01 by any of the three
goodness-of-fit tests were dropped from the candidate models.
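This fencing-off step can be sketched as follows. The sketch is a hedged stand-in for the study's EasyFit workflow (scipy's parameterisations replace EasyFit's, and only the Kolmogorov-Smirnov test is shown, whereas the study also applied Anderson-Darling and chi-squared tests):

```python
# Screen a fitted distribution with the K-S test at all five significance levels.
# Rejection at level alpha means p < alpha, so surviving every level amounts to
# p >= max(alphas).
import numpy as np
from scipy import stats

ALPHAS = (0.2, 0.1, 0.05, 0.02, 0.01)

def passes_ks_screen(sample, dist, params, alphas=ALPHAS):
    """True if the fitted model is not rejected at any of the significance levels."""
    p_value = stats.kstest(sample, dist.cdf, args=params).pvalue
    return all(p_value >= a for a in alphas)

# Example: a well-spread sample tested against the uniform(0, 1) distribution.
sample = np.linspace(0.05, 0.95, 10)
print(passes_ks_screen(sample, stats.uniform, ()))   # a close fit survives all levels
```

A model is kept as a final candidate only if the analogous check passes for all three tests on every sample.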
B. Development of the Least Sum of Statistics Model Identification Criterion (LSSMIC)
1) Introduction: The Least Sum of Statistics Model Identification Criterion (LSSMIC) was developed by
determining the least sum of the statistics of the goodness-of-fit tests Kolmogorov-Smirnov, Anderson-Darling
and Chi-Squared. The mathematical principle of each of the tests on which the development of LSSMIC was
based is briefly discussed below.
2) Kolmogorov-Smirnov statistic $D_n$: The Kolmogorov-Smirnov statistic is based on the uniform law of
large numbers, which is expressed in the Glivenko-Cantelli theorem (Wellner, 1977).
The theorem can be summarized as:

$\|F_n - F\|_{\infty} = \sup_{x} |F_n(x) - F(x)| \to 0$ almost surely   (2.3)

In this case $F$ is the cumulative distribution function and $F_n$ is the empirical cumulative distribution
function defined by:

$F_n(x) = \dfrac{1}{n} \sum_{i=1}^{n} \mathbf{1}_{\{X_i \le x\}}$   (2.4)

where the $X_i$ are i.i.d. with distribution $F$ and $\mathbf{1}_{\{X_i \le x\}}$ is the indicator function;
equivalently, $F_n$ is the empirical distribution that assigns mass $1/n$ to each observation $X_i$.
The uniformity of this law of large numbers can be explained as follows. For each fixed $x$, the ordinary law
of large numbers gives

$F_n(x) \to F(x)$ almost surely   (2.5)

while the Kolmogorov-Smirnov statistic measures the largest such discrepancy:

$D_n = \sup_{x} |F_n(x) - F(x)|$   (2.6)

According to the Glivenko-Cantelli theorem (Wellner, 1977), the convergence in 2.5 happens uniformly over
$x$. The application of the Kolmogorov-Smirnov statistic as an index for determining the best-fit model among
the candidate models was based on this theorem: the model with the least Kolmogorov-Smirnov statistic was
taken as the best fit for this test, since $D_n \to 0$ as in equation 2.6.
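As a minimal sketch (not the study's software), the statistic of equation 2.6 can be computed at the sample points, where the supremum of the empirical step function must occur:

```python
# Kolmogorov-Smirnov statistic D_n = sup_x |F_n(x) - F(x)| for a given CDF.
import numpy as np

def ks_statistic(sample, cdf):
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    f = cdf(x)
    d_plus = np.max(np.arange(1, n + 1) / n - f)    # F_n just after each jump
    d_minus = np.max(f - np.arange(0, n) / n)       # F_n just before each jump
    return max(d_plus, d_minus)
```

Checking both sides of each jump is what makes this equivalent to the supremum over all real $x$.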
3) Anderson-Darling statistic $A_n^2$: The Anderson-Darling statistic is an index of goodness of fit. In this
case the Anderson-Darling goodness-of-fit test is the comparison of the empirical distribution function
$F_n(x)$ with the distribution $F(x)$ assumed to be the parent distribution, that is, the distribution being fitted
to the sample data.
The hypothesis is

$H_0: F_n(x) = F(x)$   (2.7)

The hypothesis is rejected if $F_n(x)$ is very different from $F(x)$.
The difference between $F_n(x)$ and $F(x)$ is defined by:

$W_n^2 = n \displaystyle\int_{-\infty}^{\infty} [F_n(x) - F(x)]^2 \, \psi(F(x)) \, dF(x)$   (2.8)

where $\psi$ is a weight function.
For a given variable $X$ and a distribution $F$ to be fitted to samples of events of $X$, the random variable
$n F_n(x)$ has a binomial distribution with probability $F(x)$ (Anderson, 1952). The expected value of
$n F_n(x)$ is $n F(x)$ and the variance is $n F(x)[1 - F(x)]$.
Since the objective was to identify best fit models for frequency analysis of extreme events, the emphasis was
put on the upper and lower tails of the models, in this case by choosing the weight

$\psi(u) = \dfrac{1}{u(1-u)}$   (2.9a)

so that, specifically for extreme mean rainfall events,

$\psi(F(x)) = \dfrac{1}{F(x)[1 - F(x)]}$   (2.9b)

With this weight, the standardized difference in equation 2.8 has mean 0 and variance 1, and this leads to the
Anderson-Darling statistic:

$A_n^2 = n \displaystyle\int_{-\infty}^{\infty} \dfrac{[F_n(x) - F(x)]^2}{F(x)[1 - F(x)]} \, dF(x)$   (2.10)

Equation 2.10 can be rearranged and written as

$A_n^2 = -n - \dfrac{1}{n} \displaystyle\sum_{i=1}^{n} (2i - 1)\left[\ln F(X_i) + \ln\bigl(1 - F(X_{n+1-i})\bigr)\right]$   (2.11)

where $X_1 \le X_2 \le \cdots \le X_n$ is the ordered sample.
From equations 2.10 and 2.11, it can be concluded that the model with the least value of $A_n^2$ among the
candidate models is the best fit.
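A sketch of equation 2.11 (an illustrative implementation, not the study's software), computed from the ordered sample and a fitted CDF, with the logs clipped away from 0 and 1 to stay numerically safe in the extreme tails:

```python
# Anderson-Darling statistic A_n^2 via the order-statistic form of equation 2.11.
import numpy as np

def anderson_darling(sample, cdf):
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    f = np.clip(cdf(x), 1e-12, 1.0 - 1e-12)   # guard the logarithms
    i = np.arange(1, n + 1)
    s = np.sum((2 * i - 1) * (np.log(f) + np.log(1.0 - f[::-1])))
    return -n - s / n
```

The reversed array `f[::-1]` supplies the $F(X_{n+1-i})$ terms of the sum.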
4) Chi-Squared statistic $\chi^2$: Applying $\chi^2$ as an index for determining the best-fit model of extreme
low mean annual rainfall events from the candidate models was based on the interpretation of the reduced
chi-squared statistic $\chi_{\nu}^2$ described below.
A value of $\chi_{\nu}^2 \approx 1$ indicates that the discrepancy between the empirical distribution function
and the model being fitted to the sample data is in accord with the error variance. Therefore the best fit model
among the candidate models is the one with the least such discrepancy, and the absolute index
$\mathrm{abs}(1 - \chi_{\nu}^2)$ was adopted in the study.
The index was defined as:

$\chi_{\nu}^2 = \dfrac{1}{\nu} \displaystyle\sum_{i=1}^{k} \dfrac{(O_i - E_i)^2}{E_i}$   (2.12a)

$\mathrm{abs}(1 - \chi_{\nu}^2) = \left| 1 - \chi_{\nu}^2 \right|$   (2.12b)

where $\nu$ is the number of degrees of freedom, given by $\nu = k - p - 1$; $E_i$ is the expected frequency in
the corresponding bin; $O_i$ is the observed frequency in each bin; $k$ is the number of bins; and $p$ is the
number of fitted parameters of the candidate model.
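A sketch of equations 2.12(a)-(b); the bin counts and parameter count below are illustrative arguments, not the study's data:

```python
# Absolute reduced chi-squared index: |1 - chi2/nu| over k bins.
import numpy as np

def abs_reduced_chi_squared(observed, expected, n_fitted_params):
    observed = np.asarray(observed, dtype=float)
    expected = np.asarray(expected, dtype=float)
    nu = len(observed) - n_fitted_params - 1          # degrees of freedom
    chi2 = np.sum((observed - expected) ** 2 / expected)
    return abs(1.0 - chi2 / nu)
```

The closer the index is to 0, the closer the model's discrepancy is to the expected error variance.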
C. Elements of LSSMIC
To develop an alternative but simple model selection criterion, the advantages of $D_n$, $A_n^2$ and
$\mathrm{abs}(1 - \chi_{\nu}^2)$ were integrated. The integration of the three goodness-of-fit statistics led to
the Least Sum of Statistics Model Identification Criterion (LSSMIC), defined as:

$\mathrm{LSSMIC} = \min_{j = 1, \ldots, m} \left[ D_{n,j} + A_{n,j}^2 + \mathrm{abs}(1 - \chi_{\nu,j}^2) \right]$   (2.13)

where $D_n$ is the Kolmogorov-Smirnov statistic, $A_n^2$ the Anderson-Darling statistic,
$\mathrm{abs}(1 - \chi_{\nu}^2)$ the absolute reduced chi-squared index, and $m$ the number of candidate
models.
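As a sketch, equation 2.13 amounts to summing the three statistics per candidate model and taking the minimum; the values below are the zone X3a1 statistics reported in Table VIII:

```python
# LSSMIC: sum the three goodness-of-fit statistics per candidate model and pick
# the model with the least sum (statistics for zone X3a1 from Table VIII).
candidates = {
    "Gen. Extreme Value": {"ks": 0.079, "ad": 0.234, "abs_chi": 1.898},
    "Gen. Logistic":      {"ks": 0.102, "ad": 0.531, "abs_chi": 1.863},
    "Log-Pearson 3":      {"ks": 0.093, "ad": 0.381, "abs_chi": 0.270},
}

def lssmic_best_fit(models):
    """Return the best-fit model name and the per-model sums of the three statistics."""
    sums = {name: s["ks"] + s["ad"] + s["abs_chi"] for name, s in models.items()}
    return min(sums, key=sums.get), sums

best, sums = lssmic_best_fit(candidates)
print(best)   # Log-Pearson 3, as in Table VIII
```

For zone X3a1 the sums are 2.211, 2.496 and 0.744, so Log-Pearson 3 is selected.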
D. Application of CMIC and LSSMIC
The developed CMIC and LSSMIC criteria were applied in the EasyFit 5.5 software to identify candidate
and best fit models for the frequency analysis of the extreme mean annual rainfall events. Mean annual rainfall
data for the period 1920-2004 for each of the 8 rainfall zones were obtained from the Water Research
Commission, Pretoria. EasyFit 5.5 was chosen for this study because of the following features (MathWave, 2011):
1) It supports more than 50 continuous and discrete probability distribution functions.
2) It has a powerful automated fitting mode combined with flexible manual fitting capabilities.
3) It carries out goodness-of-fit tests.
4) It is capable of generating random numbers.
5) It has an easy-to-use interface.
6) There is comprehensive technical assistance from the developers (MathWave, 2011).
Other features which make EasyFit 5.5 suitable include:
o It can be applied to analyze large data sets (up to 250,000 data points).
o It includes advanced distributions to improve the validity of fitted probability distribution functions.
o It can be applied to calculate descriptive statistics.
o It organizes data and analysis results into project files.
1) Application of the points below threshold (PBT) model: The points below threshold (PBT) model was
applied to identify extreme low mean annual rainfall events for each of the 8 rainfall zones. For each rainfall
zone, the mean of the mean annual rainfall values for the period 1920-2004 was chosen as the threshold for that
particular zone, and the mean rainfall events less than the threshold formed the extreme low mean annual
rainfall events. The mean of the mean annual rainfall values of each rainfall zone was chosen as the threshold
because annual rainfall less than the mean leads to agricultural drought.
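The PBT selection can be sketched as follows (the rainfall values are illustrative, not the study's data):

```python
# Points-below-threshold (PBT) selection: the threshold is the mean of the mean
# annual rainfall series, and events strictly below it form the extreme low sample.
import numpy as np

def extreme_low_events(mean_annual_rainfall):
    series = np.asarray(mean_annual_rainfall, dtype=float)
    threshold = series.mean()
    return series[series < threshold], threshold

events, threshold = extreme_low_events([600, 450, 700, 520, 480, 650])
# Here the threshold is about 566.7 mm, so 450, 520 and 480 are selected.
```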
E. Development of QT-T models
Based on the identified best fit model for frequency analysis of extreme low mean annual rainfall events
in each of the 8 rainfall zones, QT-T models were developed. The developed QT-T models were applied to
extrapolate and estimate extreme low mean annual rainfall events for return periods of 5, 10, 25, 50, 100 and
200 years for each of the 8 rainfall zones. The methods of parameter estimation applied were the method of
moments and maximum likelihood.
F. Assessment of performance of CMIC and LSSMIC
The performance of CMIC and LSSMIC in identifying candidate and best fit models for frequency
analysis of extreme mean annual rainfall events was assessed by applying probability-probability (P-P) plots in
the EasyFit 5.5 software. EasyFit 5.5 displays reference graphic plots, and the closeness of each candidate
model's plot to the reference plot was the mode of assessment applied. This was a visual assessment.
III. RESULTS
A. Descriptive statistics of extreme low mean annual rainfall for 8 rainfall zones.
The descriptive statistics of events of the 8 samples of extreme low mean annual rainfall are presented
in table II. The descriptive statistics were applied in developing QT-T models for frequency analysis of the
extreme low annual rainfall events in each rainfall zone. The skewness and excess kurtosis of each sample have
been applied in describing the symmetry and the peakedness of the frequency distributions of these samples.
Table II: Descriptive statistics

Statistic          | X3a1   | X3a2   | X3b    | X3c    | X3d1   | X3d2   | X3e    | X3f
Sample Size        | 50     | 47     | 49     | 51     | 51     | 47     | 52     | 50
Range              | 38.88  | 41.03  | 50.2   | 52.63  | 59.39  | 55.28  | 49.26  | 50.01
Mean               | 85.49  | 83.36  | 82.96  | 82.65  | 82.89  | 80.9   | 80.9   | 82.06
Variance           | 96.92  | 107.6  | 126.1  | 173.3  | 156.2  | 149.3  | 165.5  | 162.4
Std. Deviation     | 9.845  | 10.37  | 11.23  | 13.16  | 12.5   | 12.22  | 12.86  | 12.74
Coef. of Variation | 0.1152 | 0.1245 | 0.1354 | 0.1593 | 0.1508 | 0.151  | 0.159  | 0.1553
Std. Error         | 1.392  | 1.513  | 1.604  | 1.843  | 1.75   | 1.782  | 1.784  | 1.802
Skewness           | -0.549 | -0.460 | -0.923 | -1.043 | -1.14  | -0.865 | -0.764 | -0.694
Excess Kurtosis    | -0.454 | -0.428 | 0.569  | 0.784  | 1.944  | 0.738  | -0.230 | -0.200
The skewness value of each of the 8 samples of extreme mean annual rainfall is less than zero, i.e.
negative. This means that the frequency distribution of events in each of these samples is left-skewed: most of
the events are concentrated to the right of the mean, with the extreme events to the left. The excess kurtosis of
each of the 8 samples is less than 3. Four of the samples (X3b, X3c, X3d1 and X3d2) have positive excess
kurtosis, i.e. frequency distributions more peaked than the normal distribution, while the other four have
negative excess kurtosis, i.e. platykurtic distributions with peaks flatter and wider than that of the normal
distribution.
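The shape description above can be checked numerically; the sketch below uses scipy's sample skewness and Fisher (excess) kurtosis on illustrative values, not the study's data:

```python
# Shape measures used to describe Table II: sample skewness and excess kurtosis.
import numpy as np
from scipy import stats

def describe_shape(sample):
    s = stats.skew(sample)        # s < 0: left-skewed, extreme events in the left tail
    k = stats.kurtosis(sample)    # Fisher definition: k < 0 platykurtic, k > 0 leptokurtic
    return s, k

# A left-skewed sample: mass concentrated to the right of the mean, one low extreme.
s, k = describe_shape(np.array([40.0, 75.0, 80.0, 82.0, 85.0, 86.0, 88.0, 90.0]))
```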
B. Initial candidate models
The results of applying CMIC to each of the 8 data samples of extreme low mean annual rainfall are
presented in table III. To obtain these results, the method described in section B was applied. The results showed
that all 8 samples had the same candidate models, presented in table III. These candidate models are: Beta,
Generalised Extreme Value, Generalised Logistic, Generalised Pareto, Johnson SB, Kumaraswamy, Log-Pearson
3, Pert, Phased Bi-Weibull, Power Function, Reciprocal, Triangular, Uniform and Wakeby.
C. Final candidate models
Step 2 of the CMIC, described in section B, was applied to the sets of initial candidate models given in
table III. The results are presented in tables IV-VI. They showed that the final candidate models for frequency
analysis of extreme low mean annual rainfall events in each of the 8 rainfall zones were: Generalised Logistic
(GL), Log-Pearson 3 (LP3) and Generalised Extreme Value (GEV). The CDFs and PDFs of the identified
candidate models are presented in table VII.
Generalised Logistic, Log-Pearson 3 and Generalised Extreme Value were identified as the final
candidate models because none of them was rejected at significance levels 0.2, 0.1, 0.05, 0.02 and 0.01 in the
three goodness-of-fit tests in any of the 8 samples of extreme low mean rainfall events. The results of the
goodness-of-fit tests are presented in tables IV-VI.
Table III: The identified initial candidate models

Distribution          | $D_n$ Statistic | Rank | $A_n^2$ Statistic | Rank | $\chi^2$ Statistic | Rank
Beta                  | 0.0731          | 3    | 2.155             | 7    | 2.56               | 5
Gen. Extreme Value    | 0.0795          | 5    | 0.2345            | 3    | 2.898              | 7
Gen. Logistic         | 0.102           | 10   | 0.5311            | 5    | 2.863              | 6
Gen. Pareto           | 0.0894          | 7    | 11.49             | 12   | N/A                | -
Johnson SB            | 0.0704          | 1    | 0.168             | 1    | 4.18               | 9
Kumaraswamy           | 0.0718          | 2    | 2.153             | 6    | 2.56               | 4
Log-Pearson 3         | 0.0931          | 8    | 0.3815            | 4    | 1.27               | 1
Pert                  | 0.1366          | 12   | 4.782             | 10   | 6.4                | 10
Phased Bi-Exponential | 0.9992          | 15   | 512.6             | 15   | N/A                | -
Phased Bi-Weibull     | 0.4529          | 14   | 52.26             | 14   | N/A                | -
Power Function        | 0.0871          | 6    | 2.216             | 8    | 1.78               | 2
Reciprocal            | 0.2855          | 13   | 9.87              | 11   | 16.18              | 11
Triangular            | 0.1047          | 11   | 2.431             | 9    | 2.32               | 3
Uniform               | 0.0954          | 9    | 15.01             | 13   | N/A                | -
Wakeby                | 0.0748          | 4    | 0.2058            | 2    | 3.774              | 8
Tables IV-VI: Identified final candidate models

Table IV: Generalised Extreme Value distribution (sample size 47)

Test                           | Statistic | P-Value | Rank
Kolmogorov-Smirnov ($D_n$)     | 0.0815    | 0.8886  | 1
Anderson-Darling ($A_n^2$)     | 0.34      |         | 1
Chi-Squared ($\chi^2$), 4 d.f. | 4.17      | 0.3836  | 2

Significance level      | 0.2   | 0.1    | 0.05   | 0.02   | 0.01
$D_n$ critical value    | 0.153 | 0.1748 | 0.1942 | 0.2172 | 0.233
$A_n^2$ critical value  | 1.375 | 1.929  | 2.502  | 3.289  | 3.907
$\chi^2$ critical value | 5.989 | 7.779  | 9.488  | 11.67  | 13.28
Reject $H_0$?           | No    | No     | No     | No     | No
Table V: Generalised Logistic distribution (sample size 47)

Test                           | Statistic | P-Value | Rank
Kolmogorov-Smirnov ($D_n$)     | 0.1055    | 0.6343  | 3
Anderson-Darling ($A_n^2$)     | 0.5878    |         | 3
Chi-Squared ($\chi^2$), 4 d.f. | 3.292     | 0.5102  | 1

Significance level      | 0.2   | 0.1    | 0.05   | 0.02   | 0.01
$D_n$ critical value    | 0.153 | 0.1748 | 0.1942 | 0.2172 | 0.233
$A_n^2$ critical value  | 1.375 | 1.929  | 2.502  | 3.289  | 3.907
$\chi^2$ critical value | 5.989 | 7.779  | 9.488  | 11.67  | 13.28
Reject $H_0$?           | No    | No     | No     | No     | No
Table VI: Log-Pearson 3 distribution (sample size 47)

Test                           | Statistic | P-Value | Rank
Kolmogorov-Smirnov ($D_n$)     | 0.0899    | 0.8093  | 2
Anderson-Darling ($A_n^2$)     | 0.3954    |         | 2
Chi-Squared ($\chi^2$), 5 d.f. | 4.33      | 0.5029  | 3

Significance level      | 0.2   | 0.1    | 0.05   | 0.02   | 0.01
$D_n$ critical value    | 0.153 | 0.1748 | 0.1942 | 0.2172 | 0.233
$A_n^2$ critical value  | 1.375 | 1.929  | 2.502  | 3.289  | 3.907
$\chi^2$ critical value | 7.289 | 9.236  | 11.07  | 13.39  | 15.09
Reject $H_0$?           | No    | No     | No     | No     | No
Table VII: Candidate models for modeling the distribution of extreme low mean annual rainfall events in 8
rainfall zones in Sabie river catchment. One standard parameterisation of each model is given, with $k$ the
shape, $\sigma$ the scale and $\mu$ the location parameter.

Distribution              | CDF or PDF | Domain
Generalized Extreme Value | CDF: $F(x) = \exp\left\{-\left[1 + k\frac{x-\mu}{\sigma}\right]^{-1/k}\right\}$ for $k \ne 0$; $F(x) = \exp\left\{-e^{-(x-\mu)/\sigma}\right\}$ for $k = 0$ | $1 + k(x-\mu)/\sigma > 0$ for $k \ne 0$; all $x$ for $k = 0$
Generalized Logistic      | CDF: $F(x) = \left[1 + \left(1 + k\frac{x-\mu}{\sigma}\right)^{-1/k}\right]^{-1}$ for $k \ne 0$ | $1 + k(x-\mu)/\sigma > 0$
Log-Pearson 3             | PDF: $f(x) = \frac{1}{x\,|\beta|\,\Gamma(\alpha)}\left(\frac{\ln x - \gamma}{\beta}\right)^{\alpha-1}\exp\left(-\frac{\ln x - \gamma}{\beta}\right)$ | $x > 0$ with $(\ln x - \gamma)/\beta > 0$
D. Results of identification of best-fit models for frequency analysis of extreme low mean annual rainfall
events in Sabie river catchment
The best fit models for frequency analysis of extreme low mean annual rainfall events in the 8 rainfall
zones of the Sabie river catchment are presented in tables VIII-XV. The summary of the best fit model
identification results is presented in table XVI.
Table VIII: Best fit model: Zone X3a1

Distribution       | $D_n$ | $A_n^2$ | $\chi^2$ | Abs(1-$\chi^2$) | LSSMIC | Best fit
Gen. Extreme Value | 0.079 | 0.234   | 2.898    | 1.898           | 2.211  |
Gen. Logistic      | 0.102 | 0.531   | 2.863    | 1.863           | 2.496  |
Log-Pearson 3      | 0.093 | 0.381   | 1.270    | 0.270           | 0.744  | X
Table IX: Best fit model: Zone X3a2

Distribution       | $D_n$ | $A_n^2$ | $\chi^2$ | Abs(1-$\chi^2$) | LSSMIC | Best fit
Gen. Logistic      | 0.105 | 0.587   | 3.292    | 2.292           | 2.984  | X
Gen. Extreme Value | 0.081 | 0.340   | 4.170    | 3.170           | 3.591  |
Log-Pearson 3      | 0.089 | 0.395   | 4.330    | 3.330           | 3.814  |
Table X: Best fit model: Zone X3b

Distribution  | $D_n$ | $A_n^2$ | $\chi^2$ | Abs(1-$\chi^2$) | LSSMIC | Best fit
Gen. Logistic | 0.073 | 0.302   | 2.653    | 1.653           | 2.028  | X
Johnson SB    | 0.089 | 0.241   | 5.127    | 4.127           | 4.457  |
Log-Pearson 3 | 0.077 | 0.273   | 7.055    | 6.055           | 6.405  |
Table XI: Best fit model: Zone X3c

Distribution       | $D_n$ | $A_n^2$ | $\chi^2$ | Abs(1-$\chi^2$) | LSSMIC | Best fit
Gen. Extreme Value | 0.042 | 0.133   | 0.350    | 0.649           | 0.824  |
Log-Pearson 3      | 0.059 | 0.248   | 0.866    | 0.133           | 0.440  | X
Gen. Logistic      | 0.069 | 0.286   | 2.065    | 1.065           | 1.420  |
Table XII: Best fit model: Zone X3d1

Distribution       | $D_n$ | $A_n^2$ | $\chi^2$ | Abs(1-$\chi^2$) | LSSMIC | Best fit
Wakeby             | 0.088 | 1.362   | 3.262    | 2.262           | 3.712  |
Gen. Logistic      | 0.106 | 0.516   | 3.950    | 2.950           | 3.572  | X
Gen. Extreme Value | 0.097 | 0.446   | 5.923    | 4.923           | 5.466  |
Table XIII: Best fit model: Zone X3d2

Distribution       | $D_n$ | $A_n^2$ | $\chi^2$ | Abs(1-$\chi^2$) | LSSMIC | Best fit
Gen. Extreme Value | 0.054 | 0.118   | 0.799    | 0.201           | 0.373  | X
Gen. Logistic      | 0.082 | 0.191   | 1.334    | 0.334           | 0.607  |
Log-Pearson 3      | 0.055 | 0.124   | 1.553    | 0.553           | 0.732  |
Table XIV: Best fit model: Zone X3e

Distribution       | $D_n$ | $A_n^2$ | $\chi^2$ | Abs(1-$\chi^2$) | LSSMIC | Best fit
Gen. Extreme Value | 0.088 | 0.461   | 2.029    | 1.029           | 1.578  | X
Gen. Logistic      | 0.107 | 0.771   | 3.415    | 2.415           | 3.293  |
Log-Pearson 3      | 0.125 | 0.770   | 3.432    | 2.432           | 3.327  |
Table XV: Best fit model: Zone X3f

Distribution       | $D_n$ | $A_n^2$ | $\chi^2$ | Abs(1-$\chi^2$) | LSSMIC | Best fit
Log-Pearson 3      | 0.072 | 0.401   | 1.244    | 0.244           | 0.717  | X
Gen. Extreme Value | 0.057 | 0.265   | 1.843    | 0.843           | 1.165  |
Gen. Logistic      | 0.092 | 0.560   | 4.527    | 3.527           | 4.179  |
Table XVI: Summary of best fit model identification results

Best fit model            | Rainfall zones | %
Log-Pearson 3             | 3              | 37.5
Generalized Logistic      | 3              | 37.5
Generalized Extreme Value | 2              | 25
Total                     | 8              | 100
The Log-Pearson 3 and Generalised Logistic models were each the best fit for frequency analysis of
extreme low mean annual rainfall events in 3 rainfall zones, and the Generalised Extreme Value model was the
best fit in 2 rainfall zones. The results show that there is no single model which is the best fit for frequency
analysis of all extreme low mean annual rainfall events in the Sabie river catchment.
Table XVII: Best fit and QT-T models for frequency analysis of extreme low mean annual rainfall events

Rainfall zone | Best fit model
X3a1          | LP3
X3a2          | GL
X3b           | GL
X3c           | LP3
X3d1          | GL
X3d2          | GEV
X3e           | GEV
X3f           | LP3
E. Results of frequency analysis of extreme low mean annual rainfall events in 8 rainfall zones in water
deficit catchments in South Africa
The results of frequency analysis of extreme low mean annual rainfall events in the 8 zones of water
deficit catchments in South Africa are presented in table XVIII.
Table XVIII: Recurrence intervals in years of extreme low mean annual rainfall events
F. How CMIC and LSSMIC address common limitations of model identification criteria
A general information criterion (GIC) can be expressed as:

$\mathrm{GIC}(M) = \hat{D}(M) + \lambda_n \, |M|$   (3.1)

where $\hat{D}(M)$ is the measure of lack of fit by model $M$; $|M|$ is the dimension of $M$, defined as the
number of free parameters under model $M$; and $\lambda_n$ is the penalty for complexity of the model
(parsimony), which may depend on the effective sample size $n$ and the dimension of $M$ (free parameters).
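As a toy sketch of equation 3.1 (with hypothetical numbers, not the study's), the penalty trades model dimension against fit, so a better-fitting model can still lose if it spends too many free parameters:

```python
# Generic information criterion: lack of fit plus a per-parameter complexity penalty.
def gic(lack_of_fit, n_params, penalty_per_param):
    """GIC(M) = D(M) + lambda * |M|, with a constant penalty lambda per parameter."""
    return lack_of_fit + penalty_per_param * n_params

simple = gic(10.0, 2, 2.0)     # 2-parameter model, worse fit: 14.0
complex_ = gic(8.0, 5, 2.0)    # 5-parameter model, better fit but larger penalty: 18.0
assert simple < complex_       # the parsimonious model wins under this penalty
```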
The limitations cited by Jiang (2014) were addressed in developing CMIC and LSSMIC on the basis of Equation 3.1. Emanating from this work, and based on the results obtained, the following can be noted with respect to the previously cited limitations of current model selection criteria:
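The template of Equation 3.1 can be sketched in a few lines; the familiar AIC and BIC criteria fall out as special cases of the penalty λn. The lack-of-fit values, parameter counts and record length below are hypothetical, for illustration only.

```python
import math

def gic(lack_of_fit, n_params, penalty):
    """General Information Criterion: GIC(M) = D(M) + lambda_n * d(M)."""
    return lack_of_fit + penalty * n_params

def aic(lack_of_fit, n_params):
    # AIC is the special case lambda_n = 2, independent of sample size
    return gic(lack_of_fit, n_params, 2.0)

def bic(lack_of_fit, n_params, n):
    # BIC is the special case lambda_n = ln(n), growing with effective sample size
    return gic(lack_of_fit, n_params, math.log(n))

# Hypothetical lack-of-fit values (-2 log-likelihood) and free-parameter counts
fits = {"LP3": (412.6, 3), "GEV": (410.9, 3), "Wakeby": (409.8, 5)}
n = 60  # assumed years of record
for name, (d, k) in fits.items():
    print(f"{name}: AIC = {aic(d, k):.1f}, BIC = {bic(d, k, n):.1f}")
```

Note how the five-parameter Wakeby model achieves the lowest lack of fit yet is penalised more heavily than the three-parameter models — the parsimony term at work.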
Limitation of effective sample size
Where the effective sample size is not equal to the number of sample points, this limitation is addressed by identifying candidate models and best fit models from the characteristics of the sample probability tail events and hypothesis significance levels. The shape, not the size, of the sample distribution tail is applied, which eliminates the effect of sample size. The Kolmogorov-Smirnov goodness-of-fit index, a component of LSSMIC, is generally not influenced by a possible inequality between effective sample size and the number of sample points, especially in the presence of correlation.
Limitation due to dimension of the model (parameters)
As illustrated in the previous sections, the Anderson-Darling goodness-of-fit index was adjusted to address the complexity of the model (refer to Equation 2.11). The Chi-Squared goodness-of-fit statistic was likewise adjusted to an absolute index to address the limitation of model dimension (refer to Equation 2.12(a)). Over- and under-fitting of parameters was also addressed in developing the absolute Chi-Squared index (refer to Equation 2.12(b)), and parsimony was addressed in the same procedure. Both the Anderson-Darling and Chi-Squared indices are elements of LSSMIC.
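As a minimal sketch of how these elements combine, LSSMIC can be computed as the sum of the Kolmogorov-Smirnov statistic, the adjusted Anderson-Darling statistic and the absolute Chi-Squared index |1 − χ²|, with the least sum identifying the best fit. The statistics used here are the values tabulated for one zone earlier in this section.

```python
# Goodness-of-fit statistics for one rainfall zone (from the table above)
stats = {
    "Log-Pearson 3":      {"ks": 0.072, "ad": 0.401, "chi2": 1.244},
    "Gen. Extreme Value": {"ks": 0.057, "ad": 0.265, "chi2": 1.843},
    "Gen. Logistic":      {"ks": 0.092, "ad": 0.560, "chi2": 4.527},
}

def lssmic(s):
    # Least Sum of Statistics: K-S + A-D + |1 - chi-squared|
    return s["ks"] + s["ad"] + abs(1.0 - s["chi2"])

scores = {model: round(lssmic(s), 3) for model, s in stats.items()}
best = min(scores, key=scores.get)  # least sum -> best fit
print(scores)
print(best)  # Log-Pearson 3
```

The sums reproduce the LSSMIC column of the table (0.717, 1.165 and 4.179), confirming Log-Pearson 3 as the best fit for that zone.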
Limitation of ambiguity
The LSSMIC index evaluates to a specific number, which reduces ambiguity. A special case arises when the shape or scale parameter of the Wakeby model equals zero: the Wakeby and Generalized Pareto statistics are then equal, and if they are the least, the two models are both taken as the best fit.
Rainfall zone   Best fit model   Recurrence interval in years
                                 5       10      25      50      100     200
X3a1            LP3              77.06   73.18   69.84   67.76   66.24   64.76
X3a2            GL               99.12   98.55   97.90   97.49   97.08   96.71
X3b             GL               93.62   90.30   86.91   84.84   83.06   81.52
X3c             LP3              71.67   68.23   66.78   64.15   62.90   61.54
X3d1            GL               92.60   88.91   84.59   82.00   79.73   77.72
X3d2            GEV              87.51   84.17   81.60   80.44   79.68   78.99
X3e             GEV              86.27   83.17   80.93   79.99   79.40   79.03
X3f             LP3              71.20   68.01   63.07   60.45   59.12   57.82
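As an illustrative sketch of how a recurrence-interval row like those in Table XVIII can be read off a fitted distribution: for extreme *low* events, the T-year value is the quantile with non-exceedance probability 1/T in the lower tail. The GEV parameters below are assumed for illustration only, not the fitted zone parameters.

```python
import math

def gev_quantile(p, mu, sigma, xi):
    """Quantile (inverse CDF) of the GEV distribution, Coles parameterisation."""
    if abs(xi) < 1e-12:  # Gumbel limit as the shape parameter tends to zero
        return mu - sigma * math.log(-math.log(p))
    return mu + (sigma / xi) * ((-math.log(p)) ** (-xi) - 1.0)

# Hypothetical GEV fit to mean annual rainfall (mm), illustrative only
mu, sigma, xi = 900.0, 120.0, -0.1
for T in (5, 10, 25, 50, 100, 200):
    p = 1.0 / T  # non-exceedance probability of the T-year low event
    print(f"T = {T:>3} yr: {gev_quantile(p, mu, sigma, xi):7.1f} mm")
```

As T grows, p shrinks and the quantile moves deeper into the lower tail, so the values decrease across the row — the same pattern seen in every row of Table XVIII.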
Limitation of criterion of optimality
CMIC and LSSMIC have been developed so that other parameters can be included, ensuring that technical and social parameters or variables can be added when needed. In this particular case, peaks-under-threshold models were included; in practice this could be tied to the demand for water resources at catchment level.
Limitation of small sample and extreme events
This limitation is deemed to be addressed by including the Anderson-Darling and Kolmogorov-Smirnov statistics in LSSMIC, as both cater for small-sample and extreme-event scenarios.
IV. CONCLUSION
The main objective of this study was to develop two model identification criteria for frequency analysis of extreme low mean annual rainfall events in water deficit catchments in South Africa. The two model selection criteria which were developed are:
1. Candidate Model Identification Criterion (CMIC), for identifying candidate models.
2. Least Sum of Statistics Model Identification Criterion (LSSMIC), for identifying, from the identified candidate models, the best fit models for frequency analysis of the extreme low mean annual rainfall events.
The two developed criteria were applied to identify candidate models and best fit models for frequency analyses of the extreme low mean annual rainfall events in 8 rainfall zones in the Sabie River catchment. The results showed that no single probability distribution function is the best fit for all 8 rainfall zones (Table XVI).
The Log-Pearson 3 (LP3) probability distribution function has been recommended for the design of hydro-meteorological events, mostly floods and droughts, in South Africa (Alexander, 1990, 2001). The results of this study indicate that Log-Pearson 3 may not be the only best fit model for frequency analyses of extreme hydro-meteorological events in South Africa. It is therefore important to carry out a model identification process to identify the best fit model for each specific frequency analysis.
Probability-Probability (P-P) plots were applied to evaluate the performance of CMIC and LSSMIC. The plots showed that CMIC and LSSMIC performed fairly well. Although the P-P plot results cannot be considered completely conclusive, the CMIC and LSSMIC criteria are useful tools for model selection in frequency analysis of extreme mean annual rainfall events.
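A P-P plot of the kind shown in Appendix A pairs each empirical non-exceedance probability (a plotting position) with the fitted model's CDF at the corresponding observation; points near the 1:1 line indicate a good fit. The construction can be sketched as follows, under an assumed sample and an assumed fitted model (a normal CDF stands in for the study's actual fits):

```python
import math

def pp_points(sample, model_cdf):
    """Return (empirical, model) probability pairs for a P-P plot."""
    xs = sorted(sample)
    n = len(xs)
    # Weibull plotting position i/(n+1) as the empirical non-exceedance probability
    return [((i + 1) / (n + 1), model_cdf(x)) for i, x in enumerate(xs)]

def normal_cdf(x, mu=800.0, sigma=150.0):
    # Placeholder fitted model, illustrative only
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

sample = [612, 705, 748, 790, 815, 840, 872, 910, 955, 1020]  # mm, illustrative
pairs = pp_points(sample, normal_cdf)
max_dev = max(abs(e - m) for e, m in pairs)  # distance from the 1:1 line
print(f"max |P(empirical) - P(model)| = {max_dev:.3f}")
```

Plotting the pairs against the diagonal (e.g. with matplotlib) reproduces the layout of Figs. 1–8; the maximum deviation is essentially the Kolmogorov-Smirnov statistic used in LSSMIC.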
ACKNOWLEDGEMENT
The study has been supported by the Provincial Dept. of Agriculture, Rural Development, Land and
Environmental Affairs: Mpumalanga Provincial Government, South Africa.
REFERENCES
[1]. Akaike, H., 1974. "A new look at the statistical model identification." IEEE Transactions on Automatic Control 19(6), 716-723.
[2]. Alexander, W.J.R., 1990. Flood Hydrology for Southern Africa. SANCOLD, Pretoria, RSA.
[3]. Alexander, W.J.R., 2001. Flood Risk Reduction Measures. University of Pretoria, Pretoria, RSA.
[4]. Alexander, W.J.R., 2002a. The standard design flood. Journal of the South African Institution of Civil Engineers 44(1), 26-31.
[5]. Anderson, T.W., Darling, D.A., 1952. Asymptotic theory of certain goodness-of-fit criteria based on stochastic processes. Annals of Mathematical Statistics 23, 193-212.
[6]. Anderson, D.R., Burnham, K.P., 2002. Avoiding pitfalls when using information-theoretic methods. Journal of Wildlife Management 66(3), 912-918.
[7]. Beven, K.J., 2000. Rainfall-Runoff Modelling: The Primer. John Wiley and Sons, Chichester, UK, 360 pp.
[8]. Bobee, B., Rasmussen, P.F., 1995. Recent advances in flood frequency analysis. Reviews of Geophysics, Supplement, July, 1111-1116.
[9]. Bodini, A., Cossu, Q.A., 2010. Vulnerability assessment of Central-East Sardinia (Italy) to extreme rainfall events. Natural Hazards and Earth System Sciences 10, 61-77.
[10]. Claeskens, G., Hjort, N.L., 2003. The Focused Information Criterion. Journal of the American Statistical Association 98, 900-916.
[11]. Feyen, H.M.J., 1970. Identification of runoff processes in catchments with a small-scale topography.
[12]. Furrer, E.M., Katz, R.W., 2008. "Improving the simulation of extreme precipitation events by stochastic weather generators." Water Resources Research 44, W12439, doi:10.1029/2008WR007316.
[13]. Gericke, O.J., 2010. Evaluation of the SDF Method Using a Customised Design Flood Estimation Tool. University of Stellenbosch, Stellenbosch, South Africa, 480 pp.
[14]. Gorgens, A.H.M., 2007. Joint Peak-Volume (JPV) Design Flood Hydrographs for South Africa. WRC Report 1420/3/07, Water Research Commission, Pretoria, South Africa.
[15]. Gorgens, A.H.M., Lysons, S., Hayes, L., Makhabana, M., Maluleke, D., 2007. Modernised South African Design Flood Practice in the Context of Dam Safety. WRC Report 1420/2/07, Water Research Commission, Pretoria, South Africa.
[16]. Goswami, P., Ramesh, K.V., 2007. Extreme rainfall events: Vulnerability analysis for disaster management and observation systems design. Research Report CSIR PP21.
[17]. Hernandez, A.R.P., Balling, R.C., Jr., Barber-Martinez, 2009. Comparative analysis of indices of extreme rainfall events: Variation and trend from southern Mexico. Atmosfera 2, 219-228.
[18]. Hosking, J.R.M., 1990. L-moments: Analysis and estimation of distributions using linear combinations of order statistics. Journal of the Royal Statistical Society, Series B 52, 105-124.
[19]. Hosking, J.R.M., Wallis, J.R., Wood, E.F., 1985. Estimation of the generalized extreme-value distribution by the method of probability-weighted moments. Technometrics 27(3), 251-261.
[20]. Hjort, N.L., Claeskens, G., 2003. Rejoinder to "The Focused Information Criterion". Journal of the American Statistical Association 98.
[21]. Jiang, J., 2014. The fence methods. Advances in Statistics, Volume 2014, Article ID 38081.
[22]. Kidson, R., Richards, K.S., Carling, P.A., 2005. Reconstructing the ca. 100-year flood in Northern Thailand. Geomorphology 70(2-4), 279-295.
[23]. Kottegoda, N.T., Rosso, R., 1998. Statistics, Probability and Reliability for Civil and Environmental Engineers. McGraw-Hill, New York.
[24]. Laio, F., et al., 2009. Model selection techniques for the frequency analysis of hydrological extremes. Water Resources Research 45, W07416.
[25]. Laio, F., 2004. Cramer-von Mises and Anderson-Darling goodness-of-fit tests for extreme value distributions with unknown parameters. Water Resources Research 40, W09308, doi:10.1029/2004WR003204.
[26]. Mathwave, 2011. http://www.mathwave.com.
[27]. Mallows, C.L., 1995. Some comments on Cp. Technometrics 37, 362-372.
[28]. Mallows, C.L., 1973. Some comments on Cp. Technometrics 15, 661-675.
[29]. Merz, R., Bloschl, G., 2004. Regionalisation of catchment model parameters. Journal of Hydrology 287, 95-123.
[30]. Mkhandi, S.H., Kachroo, R.K., Gunasekara, T.A.G., 2000. Flood frequency analysis of Southern Africa: II. Identification of regional distributions. Hydrological Sciences Journal 45(3), 449-464.
[31]. Mutua, F.M., 1994. The use of the Akaike Information Criterion in the identification of an optimum flood frequency model. Hydrological Sciences Journal 39(3), 235-244.
[32]. Salami, A.W., 2004. Prediction of the annual flow regime along Asa River using probability distribution models. AMSE Periodicals, Lyon, France, Modeling C-2004 65(2), 41-56. (http://amse-modeling.org/content.amse2004.htm)
[33]. Schwarz, G., 1978. Estimating the dimension of a model. Annals of Statistics 6, 461-464.
[34]. Schulze, R.E., 1989. Non-stationary catchment responses and other problems in determining flood series: A case for a simulation modelling approach. In: Kienzle, S.W., Maaren, H. (Eds.), Proceedings of the Fourth South African National Hydrological Symposium, SANCIAHS, Pretoria, RSA, pp. 135-157.
[35]. Shamir, E., 2005. Application of temporal streamflow descriptors in hydrologic model parameter estimation. Water Resources Research 41, doi:10.1029/2004WR003409.
[36]. Shao, J., Sitter, R.R., 1996. Bootstrap for imputed survey data. Journal of the American Statistical Association 93, 819-831.
[37]. Smithers, J.C., Schulze, R.E., 2001. Design runoff estimation: A review with reference to practices in South Africa. Proceedings of the Tenth South African National Hydrology Symposium, 26-28 September 2001, School of BEEH, University of Natal, Pietermaritzburg, RSA.
[38]. Smithers, J.C., Schulze, R.E., 2003. Design rainfall and flood estimation in South Africa. WRC Report No. 1060/01/03, Water Research Commission, Pretoria, RSA.
[39]. Stedinger, J.R., Vogel, R.M., Foufoula-Georgiou, E., 1993. Frequency analysis of extreme events. In: Maidment, D.R. (Ed.), Handbook of Hydrology, McGraw-Hill, Chap. 18, 66 pp.
[40]. Stedinger, J.R., 1980. Fitting log normal distributions to hydrologic data. Water Resources Research 16(3), 481-490.
[41]. Stephens, M.A., 1976. Asymptotic results for goodness-of-fit statistics with unknown parameters. Annals of Statistics 4, 357-369.
[42]. Van der Linde, A., 2005. DIC in variable selection. Statistica Neerlandica 59, 45-56.
[43]. Van der Linde, A., 2004. DIC in variable selection. Technical report, Institute of Statistics, University of Bremen. http://www.math.uni-bremen.de/$simsard1/downloadpapersvarse12.pdf
[44]. Wellner, J.A., 1977. A Glivenko-Cantelli theorem and strong laws of large numbers for functions of order statistics. Annals of Statistics 5(3), 473-480.
[45]. Wilks, D.S., 1998. Multisite generalization of a daily stochastic precipitation generation model. Journal of Hydrology 210, 178-191.
[46]. Willems, P., Olsson, J., Arnbjerg-Nielsen, Beecham, S., Pathirana, A., Bulow Gregersen, Madsen, H., Nguyen, V.T.V., 2007. Impact of climate change on rainfall extremes and urban drainage systems, pp. 19-20.
[47]. WMO (World Meteorological Organization), 1988. Analysing Long Time Series of Hydrological Data with Respect to Climate Variability. WCAP-No. 3, WMO-TD No. 224.
[48]. WMO (World Meteorological Organization), 2009. Guidelines on Analysis of Extremes in a Changing Climate in Support of Informed Decisions for Adaptation. WCDMP-No. 72, WMO-TD No. 1500.
[49]. Xu, Y.-P., Booij, M.J., Tong, Y.-B., 2010. Uncertainty analysis in statistical modeling of extreme hydrological events.
[50]. Yue, S., Wang, C.Y., 2004. The Mann-Kendall test modified by effective sample size to detect trend in serially correlated hydrological series. Water Resources Management 18, 201-218.
Appendix A: Probability-Probability (P-P) plots for extreme low mean rainfall events for the 8 rainfall zones

[Each figure plots P(Model) against P(Empirical) over the interval 0 to 1 for the candidate models fitted in the given zone.]

Fig. 1: P-P plot, Zone X3a1 (Gen. Extreme Value, Gen. Logistic, Log-Pearson 3): Extreme mean annual rainfall events
Fig. 2: P-P plot, Zone X3a2 (Gen. Extreme Value, Gen. Logistic, Log-Pearson 3): Extreme mean annual rainfall events
Fig. 3: P-P plot, Zone X3b (Gen. Logistic, Log-Pearson 3, Wakeby): Extreme mean annual rainfall events
Fig. 4: P-P plot, Zone X3c (Gen. Extreme Value, Gen. Logistic, Log-Pearson 3): Extreme mean annual rainfall events
Fig. 5: P-P plot, Zone X3d1 (Gen. Extreme Value, Gen. Logistic, Wakeby): Extreme mean annual rainfall events
Fig. 6: P-P plot, Zone X3d2 (Gen. Extreme Value, Gen. Logistic, Log-Pearson 3): Extreme mean annual rainfall events
Fig. 7: P-P plot, Zone X3e (Gen. Extreme Value, Gen. Logistic, Log-Pearson 3): Extreme mean annual rainfall events
Fig. 8: P-P plot, Zone X3f (Gen. Extreme Value, Gen. Logistic, Log-Pearson 3): Extreme mean annual rainfall events