This document discusses using data mining techniques to build process models from full-scale plant data to optimize water and wastewater treatment processes. It provides several case studies where neural networks were used to model relationships between key process variables and contaminant levels. For example, one case study showed turbidity, color, and temperature accounted for 74% of the variability in chloroform levels. The document recommends using process models to predict contaminant levels, optimize chemical dosing, and evaluate "what if" scenarios to reduce operating costs while meeting regulations.
2. Acknowledgement
Ed Roehl – CTO
• World-class industrial researcher
• Software design, development, and project management
• Advanced process engineering, computer-based modeling and optimization methods, industrial R&D, product/process design automation, CAE, PDM
• Data mining, multivariate analysis, predictive modeling, simulation, advanced control, signal processing, non-linear/chaotic systems, computational geometry
• AI, expert systems, OOP/computer languages, machine learning/artificial neural networks
Uwe Mundry, Partner
• World-class software design and development
• Multi-spectral and hyper-spectral imaging and pattern recognition, 4D medical imaging, 4D geographical imaging, homeland security applications, real-time decision support systems with industrial applications
• Data mining, multivariate analysis, predictive modeling, simulation, advanced control, signal processing, non-linear/chaotic systems, computational geometry, machine learning/artificial neural networks
• OOP/multiple computer languages; medical and environmental imaging
3. Why optimize your plant?
• Reduced operating budgets (10% very common)
• Increasingly stringent regulations
  – Water treatment?
  – Wastewater treatment?
• Increasing cost of capital improvements
  – USD worth less
  – QE2 will lower the value of debt instruments such as bonds
4. Process optimization by modeling
1. Modeling processes through various means
  a. Bench-scale models
  b. Pilot-scale models
  c. Mathematical models
    1) Deterministic/mechanistic—based on first principles
    2) Empirical—either statistical or based upon some optimal function to describe behavior
    3) Hybrid of 1) and 2)
5. Process optimization by modeling
What is a mathematical model?
"…consistent set of mathematical equations which is thought to correspond to some other entity, its prototype." – Rutherford Aris
6. Definitions for pilot-scale modeling
• Geometric Similarity—All lengths of the model and the prototype must be in the same ratio, and all corresponding angles must be equal. [This is the easy one to achieve.]
• Kinematic Similarity—Ratios of fluid velocity and other relevant velocities must be the same for the model and prototype. Ratios of flow time scale and boundary time scale must be the same. [Problems with laminar/turbulent behavior.]
• Dynamic Similarity—The force polygons for the model and prototype must be proportional; for example, forces such as inertia, pressure, viscous forces, surface tension forces, etc.
7. Equations of importance
• R = ρVℓ/µ (Reynolds number; very important!)
• W = ρV²ℓ/σ (Weber number; surface tension effects)
• F = V/(gℓ)^½ (Froude number; free surface effects)
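As a quick sketch (illustrative fluid properties; the function names and example values are mine, not from the slides), these dimensionless groups can be computed directly:

```python
import math

def reynolds(rho, V, L, mu):
    """R = rho*V*L/mu: ratio of inertial to viscous forces."""
    return rho * V * L / mu

def weber(rho, V, L, sigma):
    """W = rho*V^2*L/sigma: inertial vs. surface-tension forces."""
    return rho * V ** 2 * L / sigma

def froude(V, L, g=9.81):
    """F = V/sqrt(g*L): inertial vs. free-surface (gravity) forces."""
    return V / math.sqrt(g * L)

# Illustrative values: water at ~20 C moving at 0.5 m/s over a 1 m length scale
rho, mu, sigma = 998.0, 1.0e-3, 0.0728
print(reynolds(rho, 0.5, 1.0, mu))  # ~5e5, i.e. turbulent
print(weber(rho, 0.5, 1.0, sigma))  # ~3.4e3, surface tension negligible
print(froude(0.5, 1.0))             # ~0.16, subcritical free-surface flow
```

Matching R, W, and F simultaneously between a model and its prototype is generally impossible with the same fluid, which is the root of the scale-up problems discussed next.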
8. Scale-up problems with models
1. For bench-scale and pilot-scale:
  a. Example of problems with scale-up for a simple drag coefficient, CD:
     CD = f(R, W, F, α)
     [Where is this important for water treatment?]
  c. Pilot-scale testing is good for comparing one pilot train with another pilot train, but not for finding absolute numbers for full-scale
9. So what of models?
"Models are undeniably beautiful, and a man may justly be proud to be seen in their company. But they may have their hidden vices. The question is, after all, not only whether they are good to look at, but whether we can live happily with them."
– Abraham Kaplan, The Conduct of Inquiry
10. Another problem: chaotic behavior
• "Deterministic evolution of a nonlinear system which is between regular behavior and stochastic behavior." – Abarbanel
• "The property that characterizes a dynamical system in which most orbits exhibit sensitive dependence." – Lorenz
• "Neither periodic nor stochastic behaviors that have structure in state/feature space, making them somewhat predictable." – ADMi
11. Lorenz attractor shows problem
• Poster child of chaos
• Purely synthetic, derived from 3 equations
  – dx/dt = −σx + σy
  – dy/dt = −xz + rx − y
  – dz/dt = xy − bz
[Figure: time signal and 3-D delay plot showing "orbitals"; trajectories switch between mode 1 and mode 2, illustrating "extreme sensitivity to changes in boundary conditions."]
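A minimal sketch of the three equations above (forward-Euler integration; the classic parameter values σ = 10, r = 28, b = 8/3 are assumed, since the slide does not state them):

```python
import numpy as np

def lorenz_step(state, dt, sigma=10.0, r=28.0, b=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system above."""
    x, y, z = state
    dx = -sigma * x + sigma * y
    dy = -x * z + r * x - y
    dz = x * y - b * z
    return state + dt * np.array([dx, dy, dz])

def trajectory(state0, dt=0.005, n=10000):
    """Integrate n steps from state0 and return the full path."""
    traj = np.empty((n, 3))
    traj[0] = state0
    for i in range(1, n):
        traj[i] = lorenz_step(traj[i - 1], dt)
    return traj

# Two starts differing by 1e-8 in z: the hallmark "extreme sensitivity"
a = trajectory(np.array([1.0, 1.0, 1.0]))
b = trajectory(np.array([1.0, 1.0, 1.0 + 1e-8]))
print(np.abs(a[-1] - b[-1]))  # the tiny perturbation has grown by many orders of magnitude
```

Both runs stay bounded on the attractor, yet their endpoints are effectively unrelated, which is exactly the forecasting problem the next slides address.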
14. Modeling chaotic behavior, 1
State Space Reconstruction (SSR)
• SSR is the means by which complex, constantly changing processes can be represented in straightforward geometric terms for visualization and modeling. SSR is like super-trending. It suggests that a process' state space can be optimally, but not perfectly, characterized by state vectors Y(t). The vectors are constructed using an optimal number of measurements, equal to the "local dimension" dL (Abarbanel, 1996), that are spaced optimally apart in time by integer multiples of an optimal time delay d. Mathematically:
• Y(t) = [x(t), x(t − d), x(t − 2d), …, x(t − (dL − 1)d)]  (eq. 1)
• Note that here Y(t) is univariate. Values of dL and d are estimated analytically or experimentally from the data.
15. Modeling chaotic behavior, 2
• For a multivariate process of k independent variables:
• Y(t) = {[x1(t), x1(t − d1), …, x1(t − (dL1 − 1)d1)], …, [xk(t), xk(t − dk), …, xk(t − (dLk − 1)dk)]}  (eq. 2)
• This provides each variable with its own dL and d. A further generalization that provides non-fixed time-delay spacing for each variable:
• Y(t) = {[x1(t), x1(t − d1,1), …, x1(t − (dL1 − 1)d1,dL1−1)], …, [xk(t), xk(t − dk,1), …, xk(t − (dLk − 1)dk,dLk−1)]}  (eq. 3)
• Determining the best variables xk to use, and properly estimating dimensions dLk and time delays dk by analytical or experimental means, helps to ensure that a given process can be successfully reconstructed.
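The univariate construction of eq. 1 can be sketched as follows (the function name and toy series are mine; in practice dL and d would come from the analytic estimates the slides mention):

```python
import numpy as np

def delay_embed(x, dL, d):
    """Build state vectors Y(t) = [x(t), x(t-d), ..., x(t-(dL-1)*d)] (eq. 1).

    x  : 1-D array of measurements
    dL : local dimension (coordinates per state vector)
    d  : time delay, in samples
    Returns an array whose row i is Y(t) for t = (dL-1)*d + i.
    """
    n = len(x) - (dL - 1) * d
    # column j holds x(t - j*d), i.e. the slice starting at (dL-1-j)*d
    cols = [x[(dL - 1 - j) * d : (dL - 1 - j) * d + n] for j in range(dL)]
    return np.column_stack(cols)

x = np.arange(10)              # toy series with x(t) = t
Y = delay_embed(x, dL=3, d=2)
print(Y[0])   # [4 2 0] = [x(4), x(2), x(0)]
print(Y[-1])  # [9 7 5] = [x(9), x(7), x(5)]
```

The multivariate forms in eqs. 2 and 3 simply concatenate one such block per variable, each with its own dL and delays.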
17. Consider modeling the full-scale system with the full-scale system
1. Approach
  a. Use data mining to extract the information contained in the full-scale data
  b. Eliminates problems inherent in scale-up
  c. Chaotic behavior can be modeled
  d. Systematic and objective approach to optimizing information
19. A view of a general process
[Diagram: a PHYSICAL PROCESS block with inputs x1–x8 and outputs y1–y3; outputs may be multiply periodic, chaotic, or stochastic.]
Causes of Variability
• people
• configuration of controls
• raw water
• weather
• chemicals
• Outputs that are predictable can then be controlled
• Outputs that are unpredictable cannot be controlled
20. Relate variables with neural networks
• Inspired by the brain
  – get complicated behaviors from lots of "simple" interconnected devices: neurons and synapses
  – non-linear, multivariate curve fitting
  – models are synthesized from example data
• Machine learning
[Diagram: a network mapping inputs x1–x5 to outputs y1 and y2.]
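As a minimal sketch of this "non-linear, multivariate curve fitting" idea (numpy only; the synthetic data, network size, and learning rate are all illustrative and not the ADMi toolchain):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "process" data: one input, one non-linear output
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X) + 0.05 * rng.normal(size=X.shape)

# One hidden layer of tanh "neurons"; the weights play the role of synapses
H = 16
W1 = rng.normal(scale=0.5, size=(1, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(H, 1)); b2 = np.zeros(1)

lr = 0.1
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)            # forward pass
    y_hat = h @ W2 + b2
    err = y_hat - y
    # backpropagate mean-squared-error gradients
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(mse)  # should be well below the raw output variance
```

With several inputs and outputs, the same machinery becomes the multivariate mapping x1…x5 → y1, y2 sketched on the slide.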
21. ANNs produce response surfaces
Example: Trihalomethanes Formation
[Figure: a surface fitted by a non-linear ANN model represents normal behavior; regions with no data, deviation from normal, and possibly better conditions are annotated.]
23. Modeling chloroform
• Input = TURBFIN (MWA=4, t=−1): R²_ANN = 0.47, RMSE = 7.3
• +Input = COLORFIN (MWA=4): R²_ANN = 0.60, RMSE = 6.2
• +Input = TPFIN: R²_ANN = 0.74, RMSE = 5.0
[Plots: measured vs. modeled chloroform on days when DBPs were measured, R²_ANN = 0.74; response surfaces at TPFIN = 11 °C and 32 °C show chloroform (CF) higher at high temperature.]
24. Observations about chloroform
• Finished turbidity accounts for 47% of variability in chloroform
• Finished turbidity + color accounts for 60%
• Finished turbidity + color + temperature accounts for 74%; i.e., R²_ANN = 0.74
• Recommend:
  1) optimize turbidity removal—most important. Is this counterintuitive?
  2) optimize TOC removal
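The add-one-input-at-a-time procedure behind these percentages can be sketched on synthetic data (a linear least-squares fit stands in for the ANN here; the variable coefficients and noise level are invented purely for illustration):

```python
import numpy as np

def r2_of_fit(X, y):
    """R^2 of an ordinary least-squares fit of y on the columns of X."""
    A = np.column_stack([X, np.ones(len(y))])      # add an intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(1)
n = 300
turb = rng.normal(size=n)      # stand-in for TURBFIN
color = rng.normal(size=n)     # stand-in for COLORFIN
temp = rng.normal(size=n)      # stand-in for TPFIN
cf = 0.7 * turb + 0.4 * color + 0.5 * temp + 0.4 * rng.normal(size=n)

# Add inputs one at a time and watch R^2 climb, as on the slide
for cols in ([turb], [turb, color], [turb, color, temp]):
    print(r2_of_fit(np.column_stack(cols), cf))
```

Each added input can only raise R²; the size of the jump indicates how much independent explanatory power that variable carries.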
26. Modeling BDM, Part 1
• Inputs = TURBFIN (t=−2), COLORFIN (MWA=3): R²_ANN = 0.24, RMSE = 1.8
• +Input = TPFIN: R²_ANN = 0.66, RMSE = 1.2
• BDM is far more sensitive to TPFIN than to TURBFIN and COLORFIN
[Plots: measured vs. modeled BDM on days when DBPs were measured, R²_ANN = 0.66; response surfaces at TPFIN = 11 °C and 32 °C.]
27. Observations regarding BDM
• Finished turbidity + finished color accounts for 24% [very low correlation!]
• Finished turbidity + color + temperature accounts for 66%; i.e., R²_ANN = 0.66
• So BDM is dominated by temperature
28. Modeling BDM, Part 2
• Remove TURBFIN, add inputs = PRE-Cl2: R²_ANN = 0.72, RMSE = 1.1
• BDM sensitivity to PRE-Cl2 and NH3 is higher at low TPFIN; BDM is higher at higher COLORFIN; TP is the dominant effect
[Response surfaces at the four combinations of TPFIN = 11 °C / 32 °C and COLORFIN = 1.0 / 3.0.]
29. Modeling TCA
• Input = TURBFIN (MWA=4, t=−3): R²_ANN = 0.47, RMSE = 5.5
• +Input = COLORFIN (MWA=4): R²_ANN = 0.47, RMSE = 5.5
• +Input = TPFIN: R²_ANN = 0.61, RMSE = 4.7
• TCA is less seasonal than DCA
[Plots: measured vs. modeled TCA on days when DBPs were measured, R²_ANN = 0.61; response surfaces at TPFIN = 11 °C and 32 °C.]
30. Observations modeling TCA
• Finished turbidity accounts for 47% of variability
• Finished turbidity + finished color accounts for 47% [surprising, as color is not capturing precursors!]
• Finished turbidity + color + finished temperature accounts for 61%; i.e., R²_ANN = 0.61
31. Summary - modeling THM and HAA species
• Consider finished turbidity, color, and temperature
  – indicators of organics speciation by time of year
  – treatment process kinetics and performance
• Chloroform positively correlated to finished turbidity, color, and temperature; R²_ANN = 0.74
• BDM highly seasonal; positively correlated to finished turbidity, color, and temperature, and to pre-Cl2 and NH3; R²_ANN = 0.66 to 0.72
• DCA highly seasonal; positively correlated to finished turbidity, color, and temperature; R²_ANN = 0.73
• TCA somewhat seasonal; positively correlated to finished turbidity and temperature; R²_ANN = 0.61
39. Observations for % TOC removal
• Optimal coagulation pH = 6.5
• Coagulant aid = 0.05 mg/L (or less)
  – However, coagulant aid does affect turbidity
• ClO2 = 0.8 mg/L
• Coagulant dose as a function of [TOC]
41. Total % turbidity removal
• The system is robust in removing turbidity regardless of source turbidity levels; as source turbidity increases, % removal asymptotically approaches 100%
• Goal is to minimize operating costs while meeting water quality targets
44. Observations % filtration turbidity removal
1. Turbidity removal through filtration is highly sensitive to:
  a. coagulant dose
  b. chlorine dioxide dose
2. Turbidity removal through filtration is NOT sensitive to filter polymer aid
3. Turbidity removal = f(sed. turbidity + ClO2 + coagulant + coagulant aid); R² = 0.75
4. Filter run times very low; recommend eliminating filter polymer aid
5. Recommend side-by-side filter testing
47. Observations about tank water quality
• Nitrification demonstrated by loss of total chlorine residual, lower pH, higher NO2⁻
• Total chlorine loss is pH sensitive
• Total chlorine loss is very temperature dependent
  – Nitrification rate increases exponentially above approximately 80 °F
• At pH > 9, loss of residual stabilizes
48. Questions
John B. Cook, PE
Advanced Data Mining Intl,
Greenville, SC
John.Cook@advdmi.com
843.513.2130
www.advdmi.com