1. Computational Intelligence and Environmental
Sustainability
Dr. Supreetha B.S
Dept. of Electronics & Communication Engg.
Department of Electronics & Communication Engineering 1
3. References
• An Introduction to Computational Intelligence. In: Computational Intelligence. Springer, Berlin, Heidelberg, 2005.
• Andries P. Engelbrecht, Computational Intelligence: An Introduction, Wiley.
• Russell C. Eberhart and Yuhui Shi, Computational Intelligence: Concepts to Implementations, Morgan Kaufmann Publishers (an imprint of Elsevier), 2007.
• Qihao Weng, An Introduction to Contemporary Remote Sensing, McGraw-Hill, 2012.
4. Complexity in Computational sustainability
problems
Computational Sustainability is a new
interdisciplinary field that aims to apply
techniques from computer science and
related fields, such as information
science, operations research, applied
mathematics, and statistics, to help
manage the balance between
environmental, economic, and societal
needs for a sustainable future.
5. Computational Intelligence Paradigms
• The five main paradigms of CI are: neural networks (NN), evolutionary computation (EC), swarm intelligence (SI), artificial immune systems (AIS), and fuzzy systems (FS).
• Soft Computing, a term coined by Zadeh, is a different grouping of paradigms.
• The arrow in the accompanying diagram indicates that techniques from different paradigms can be combined to form hybrid systems.
6. AI opportunity for our environment
7. Introduction
• Introduction to Computational Intelligence
• Historical views of Computational Intelligence
• Computational Intelligence Paradigms
• Modeling and Optimization
• Performance Metrics and Analysis
• Environmental issues and sustainable development
• Need and concept of sustainability: regional and global environmental issues
• Climate change, global warming, conservation and management of resources for development
• Challenges for sustainable development
• QGIS spatial and temporal data analysis
• Spatial interpolation; Artificial Intelligence tools and platforms for GIS
• Application of Computational Intelligence in environmental prediction
• Computational Intelligence in time series forecasting
8. Introduction to CI
• CI sits under the broader umbrella of natural computing.
• Why draw inspiration from nature? Natural processes (think of natural clocks) are well honed and have been running successfully for ages.
• We can view natural processes as information processing systems.
• It is a thoroughly interdisciplinary topic.
9. • CI definition: a set of nature-inspired computational methodologies and approaches.
• CI is a specific subset of AI: while AI focuses on the outcome, CI focuses on the mechanisms.
• CI paradigms
• How to make an algorithm
• CI modelling methodology
10. • Dartmouth Conference, 1956: When we talk about contemporary iterations of "artificial intelligence," we are using words coined by John McCarthy, a 28-year-old professor at Dartmouth College in 1956. The term originated from a conference on machine learning organized by McCarthy and other professors at Dartmouth College.
▪ Godfather of artificial intelligence: Frank Rosenblatt, 1928-1971:
▪ Frank Rosenblatt was a research psychologist at the Cornell Aeronautical Laboratory and a pioneer in the use of biology to inspire research in AI. His work led him to create the perceptron in 1958, an electronic device designed to mimic neural networks found in the human brain and enable pattern recognition. The perceptron was first simulated on an early IBM computer by Rosenblatt and was later developed with support from the U.S. Navy.
11. Historical perspective/milestones
• Machine learning relies on memorizing patterns in order to simulate human actions or thought. Back in the 17th century, thinkers like Gottfried Wilhelm von Leibniz sought to represent human cognition in computational terms.
• In 1673 Leibniz built the Step Reckoner, a machine that could not just add and subtract but also multiply and divide, by the turning of a hand crank that rotated a series of drums. Further advances in algebra began to provide the mathematical language to express a much wider range of ideas and open up vast new possibilities.
▪ Alan Turing, 1912-1954: As early as 1947, he spoke publicly about a "machine that can learn from experience." His 1950 method of determining whether a computer is capable of thinking like a human being, known as the Turing Test, is still influential today.
12. 20th-century science fiction
• In 2016, Ross Goodwin, an AI researcher at New York University, teamed up with director Oscar Sharp to create a bizarre, machine-written film.
• In the last two decades, there have been a number of high-profile demonstrations of AI's superiority over mere mortals. In 1997, IBM's Deep Blue beat world champion Garry Kasparov at chess, becoming the first supercomputer to defeat a reigning world champion. Another milestone was reached in 2011, when a computer system named Watson won $1 million on the U.S. TV game show "Jeopardy." Then, in 2015, Google's AlphaGo technology trounced Fan Hui, Europe's top human player, at the ancient Chinese board game Go. However, things don't always run so smoothly. Take, for instance, the time in 2016 when a lifelike robot named Sophia declared she would "destroy humans" during a demonstration at the South by Southwest conference. The statement was in response to what was apparently a joke from Sophia's creator David Hanson.
13. Fear and Innovation
• Recent innovations in artificial intelligence have left almost no area of contemporary life and work untouched. Many of our homes are now powered by "smart" devices like Amazon's Alexa and Google Nest. AI has also unleashed massive changes in medicine, agriculture and finance. Many of these examples have been positive, but the drawbacks are also increasingly apparent as governments and workers worry about how AI processes focused on efficiency could lead to massive job losses.
• In 2019, IBM reported that 120 million workers around the world will need retraining in the next three years, while Fortune magazine wrote that about 38% of location-based jobs will become automated in the next decade. In 2015, the late British physicist Stephen Hawking stated that the technology is already so advanced that computers will overtake humans "at some point within the next 100 years." His prediction was meant as a warning. "We need to make sure the computers have goals aligned with ours," he said.
14. AI moves to the city
• National and local governments around the world have incorporated AI into systems designed to manage and streamline city infrastructure and services. There are currently just over 1,000 smart city projects in countries including China, Brazil and Saudi Arabia, according to research from the professional services firm Deloitte. This is where the technology is leaving its biggest footprint on our world.
18. Introduction to Computational Intelligence
• Computational Intelligence comprises practical adaptation and self-organization concepts, paradigms, algorithms, and implementations that enable or facilitate appropriate actions (intelligent behavior) in complex and changing environments.
• Another definition: CI is a sub-branch of AI, the study of adaptive mechanisms to enable or facilitate intelligent behavior in complex and changing environments.
• These mechanisms include those AI paradigms that exhibit an ability to learn or adapt to new situations, to generalize, abstract, discover, and associate.
• A more recent definition of artificial intelligence came from the IEEE Neural Networks Council in 1996: the study of how to make computers do things at which, at the moment, people are better.
27. Hybrid approaches
• Concept of hybrid CI
• Hybrid algorithms are produced by combining a global optimization algorithm with a local search algorithm.
• Hybrid combinations:
• ANN-GA hybrid model
• ANN-PSO model
• ANN-ABC-PSO
• Need for hybrid combinations
28. Computational Intelligence Paradigms
• Each of the CI paradigms has its origins in biological systems.
• NN models biological neural systems, EC models natural evolution, and SI models the social behavior of organisms living in swarms or colonies.
• AIS models the human immune system, and FS originated from studies of how organisms interact with their environment.
29. Natural systems & metaheuristic algorithms
• Man has learned much from studies of natural systems, using what has been learned to develop new algorithmic models to solve complex problems.
• With even a little time spent observing, we start to notice distinct trends, patterns, and behaviors exhibited by different species in our natural environment.
• Humans have been inspired to simulate these behaviors on machines to solve various complex problems.
• PSO is an algorithm imitating and inspired by the behavior of a swarm of birds.
30. Example: PSO
• Optimize the hyper-parameters of a machine learning model using Particle Swarm Optimization (PSO).
• PSO is an algorithm imitating and inspired by the behavior of a swarm of birds.
• Researchers observed patterns in how swarms of birds fly together and built a mathematical emulation based on those observations.
31. • The fitness of each particle (analogous to a bird in nature) is evaluated based on some condition.
• Then a global best (the particle at the best position) is selected; each particle updates its position, evaluates its local best (the best position that individual particle has reached so far), and compares its own position with that of the global best at every instant.
• Based on the above step, each particle updates its velocity and position according to the standard PSO rules:
  v_i(t+1) = w·v_i(t) + c1·r1·(pbest_i − x_i(t)) + c2·r2·(gbest − x_i(t))
  x_i(t+1) = x_i(t) + v_i(t+1)
• The above steps are repeated until the stopping criterion is met, the solution is optimized, or the search completes successfully.
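The loop described above can be sketched in Python. This is a minimal global-best PSO; the function name, the parameter values (inertia w, coefficients c1/c2), and the sphere objective are illustrative choices, not from the slides:

```python
import random

def pso(objective, dim, n_particles=30, iters=100,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimize `objective` with a basic global-best PSO."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # local best: each particle's best position
    pbest_val = [objective(p) for p in pos]
    g = pbest_val.index(min(pbest_val))
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best of the whole swarm

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + cognitive (pbest) + social (gbest) terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]            # position update
            val = objective(pos[i])               # evaluate fitness
            if val < pbest_val[i]:                # improved its local best?
                pbest_val[i], pbest[i] = val, pos[i][:]
                if val < gbest_val:               # improved the global best?
                    gbest_val, gbest = val, pos[i][:]
    return gbest, gbest_val

# Minimize the sphere function f(x) = sum(x_d^2); the optimum is 0 at the origin.
best, best_val = pso(lambda x: sum(xd * xd for xd in x), dim=2)
```

With 30 particles and 100 iterations the swarm reliably converges close to the origin on this simple 2-D objective.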
33. Neurons vs. Units (1)
- Each element of a NN is a node, called a unit.
- Units are connected by links.
- Each link has a numeric weight.
34. Important terminologies
- Weights: each neuron is connected to other neurons by links, and each communication link is associated with a weight, which carries information about the signal that the net uses to solve a problem.
- Bias: an added component, acting like another weight, that helps in varying the network output.
- Activation function: a threshold that helps the network use important information and suppress irrelevant data points.
36. Important steps
Steps:
1. Take the inputs and calculate the weighted sum.
2. Add the bias to the sum.
3. Feed the result to the activation function.
4. Transmit the output to the next layer.
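The four steps above map directly to code. A minimal Python sketch of a single neuron (the input values and the logistic activation are illustrative choices):

```python
import math

def neuron_output(inputs, weights, bias):
    net = sum(x * w for x, w in zip(inputs, weights))  # 1. weighted sum of inputs
    net += bias                                        # 2. add the bias
    return 1.0 / (1.0 + math.exp(-net))                # 3. logistic activation

# 4. The returned value is what gets transmitted to the next layer.
y = neuron_output([0.5, 0.3], [0.4, 0.6], bias=0.1)    # net = 0.48
```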
37. How NN learns a task.
Issues to be discussed
- Initializing the weights.
- Use of a learning algorithm.
- Set of training examples.
- Encode the examples as inputs.
- Convert output into meaningful results.
44. Activation Functions
- Use different functions to obtain different models.
- The 3 most common choices:
1) Step function
2) Sign function
3) Sigmoid function
- An output of 1 represents firing of a neuron down the axon.
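The three choices can be written down directly. A Python sketch, using the usual threshold-at-zero convention:

```python
import math

def step(x):                    # binary output: neuron fires (1) or not (0)
    return 1 if x >= 0 else 0

def sign(x):                    # bipolar output: +1 or -1
    return 1 if x >= 0 else -1

def sigmoid(x):                 # smooth squashing to (0, 1); differentiable
    return 1.0 / (1.0 + math.exp(-x))
```

The sigmoid is the usual choice for backpropagation, since unlike the step and sign functions it has a nonzero derivative everywhere.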
45. Artificial Neural Networks
• The human brain is a complex, nonlinear, and parallel computer; its ability to learn, memorize, and generalize prompted research into algorithmic modelling of biological neural systems, referred to as Artificial Neural Networks.
46. ANN: Introduction
• Resembles the characteristics of a biological neural network.
• Nodes: neurons are interconnected processing elements; each connection link is associated with a weight that carries information about the input signal.
• The internal state of a neuron is called its activation.
• 2-3-1 architecture
• The number of weights in a NN with I inputs, H hidden units, and O outputs (counting biases) is
  n_w = (I + 1)·H + (H + 1)·O
  For the 2-3-1 architecture: (2 + 1)·3 + (3 + 1)·1 = 13.
48. FFNN numerical
1. The inputs are (0.10, 0.90, 0.05) and the corresponding weights are (2, 5, 2). The bias is given to be 1.
a) Calculate the net input.
b) The activation function is logistic. Calculate the output of the neuron. Also draw the neuron architecture.
2. Consider the network with [x1, x2, x3] = [0.3, 0.5, 0.6] and [w1, w2, w3] = [0.2, 0.1, -0.3]. Calculate the net input.
3. Obtain the output of the neuron Y for the network with inputs [x1, x2, x3] = [0.8, 0.6, 0.4] and weights [w1, w2, w3] = [0.1, 0.3, -0.2] with bias b = 0.35 (its input is always 1). Obtain the output of neuron Y using the activation functions a) binary sigmoid, b) bipolar sigmoid.
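The three problems can be checked quickly in Python. The helper names are mine; the numbers are the ones stated above:

```python
import math

def net_input(x, w, b=0.0):
    """Weighted sum of inputs plus bias."""
    return sum(xi * wi for xi, wi in zip(x, w)) + b

def binary_sigmoid(n):
    return 1.0 / (1.0 + math.exp(-n))

def bipolar_sigmoid(n):
    return (1.0 - math.exp(-n)) / (1.0 + math.exp(-n))

# Problem 1: net input, then logistic output.
net1 = net_input([0.10, 0.90, 0.05], [2, 5, 2], b=1.0)       # 0.2 + 4.5 + 0.1 + 1 = 5.8
y1 = binary_sigmoid(net1)

# Problem 2: net input only (no bias given).
net2 = net_input([0.3, 0.5, 0.6], [0.2, 0.1, -0.3])          # 0.06 + 0.05 - 0.18 = -0.07

# Problem 3: net input with bias, then both sigmoids.
net3 = net_input([0.8, 0.6, 0.4], [0.1, 0.3, -0.2], b=0.35)  # 0.08 + 0.18 - 0.08 + 0.35 = 0.53
y3_binary = binary_sigmoid(net3)    # ~0.6295
y3_bipolar = bipolar_sigmoid(net3)  # ~0.2590
```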
49. We're going to work with a single training set: given inputs 0.05 and 0.10, we want the neural network to output 0.01 and 0.99.
• We figure out the total net input to each hidden layer neuron, squash the total net input using an activation function (here we use the logistic function), then repeat the process with the output layer neurons.
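This forward pass can be reproduced in Python. The specific weights and biases are not shown on the slide; the values below are assumed from the widely circulated worked example (a 2-2-2 network) whose error figures the conclusion slide quotes:

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

# Inputs and targets from the slide; weights/biases are assumed values.
i1, i2 = 0.05, 0.10
t1, t2 = 0.01, 0.99
w1, w2, w3, w4, b1 = 0.15, 0.20, 0.25, 0.30, 0.35   # input -> hidden
w5, w6, w7, w8, b2 = 0.40, 0.45, 0.50, 0.55, 0.60   # hidden -> output

# Hidden layer: total net input, squashed with the logistic function.
h1 = logistic(w1 * i1 + w2 * i2 + b1)
h2 = logistic(w3 * i1 + w4 * i2 + b1)

# Output layer: repeat the same process with the hidden outputs.
o1 = logistic(w5 * h1 + w6 * h2 + b2)
o2 = logistic(w7 * h1 + w8 * h2 + b2)

# Squared error summed over both outputs, before any training.
total_error = 0.5 * (t1 - o1) ** 2 + 0.5 * (t2 - o2) ** 2
```

With these assumed weights the untrained network outputs roughly 0.751 and 0.773, giving a total error of about 0.2984.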
51. Introduction to Backpropagation
- In 1969 a method for learning in multi-layer networks, Backpropagation, was invented by Bryson and Ho.
- The Backpropagation algorithm is a sensible approach for apportioning the error contribution of each weight.
- It works on basically the same principles as perceptron learning.
52. Backpropagation Learning Details
• Method for learning weights in feed-forward (FF) nets
• Can't use the Perceptron Learning Rule: no teacher values are available for the hidden units
• Use gradient descent to minimize the error: after a forward pass, propagate deltas backward to adjust for errors, from the outputs, to the hidden layers, to the inputs
53. Backpropagation Network training
• 1. Initialize the network with random weights
• 2. For all training cases (called examples):
• a. Present the training inputs to the network and calculate the output
• b. For all layers (starting with the output layer, back to the input layer):
• i. Compare the network output with the correct output (error function)
• ii. Adapt the weights in the current layer
54. Backpropagation Algorithm
• Initialize weights (typically random!)
• Keep doing epochs
• For each example e in the training set do
•   forward pass to compute
•     O = neural-net-output(network, e)
•     miss = (T - O) at each output unit
•   backward pass to calculate deltas to weights
•   update all weights
• end
• until tuning-set error stops improving
Forward pass explained earlier; backward pass explained on the next slide
61. The Backwards Pass
• Our goal with backpropagation is to update each of the weights in the network so that they cause the actual output to be closer to the target output, thereby minimizing the error for each output neuron and for the network as a whole.
72. Conclusion
Finally, we've updated all of our weights! When we fed forward the 0.05 and 0.1 inputs originally, the error on the network was 0.298371109. After this first round of backpropagation, the total error is now down to 0.291027924. It might not seem like much, but after repeating this process 10,000 times, for example, the error plummets to 0.0000351085. At this point, when we feed forward 0.05 and 0.1, the two output neurons generate 0.015912196 (vs. the 0.01 target) and 0.984065734 (vs. the 0.99 target).
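The whole first round (forward pass, backward pass, weight update) can be verified in Python. The starting weights are assumed from the worked example the slides appear to follow, with a learning rate of 0.5 and the biases left unchanged during the update:

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

# Assumed starting weights/biases (2-2-2 network); lr = 0.5.
i1, i2, t1, t2, lr = 0.05, 0.10, 0.01, 0.99, 0.5
w1, w2, w3, w4, b1 = 0.15, 0.20, 0.25, 0.30, 0.35
w5, w6, w7, w8, b2 = 0.40, 0.45, 0.50, 0.55, 0.60

def forward(w1, w2, w3, w4, w5, w6, w7, w8):
    h1 = logistic(w1 * i1 + w2 * i2 + b1)
    h2 = logistic(w3 * i1 + w4 * i2 + b1)
    o1 = logistic(w5 * h1 + w6 * h2 + b2)
    o2 = logistic(w7 * h1 + w8 * h2 + b2)
    return h1, h2, o1, o2

h1, h2, o1, o2 = forward(w1, w2, w3, w4, w5, w6, w7, w8)
e_before = 0.5 * (t1 - o1) ** 2 + 0.5 * (t2 - o2) ** 2

# Output-layer deltas: dE/dnet = (out - target) * out * (1 - out).
d_o1 = (o1 - t1) * o1 * (1 - o1)
d_o2 = (o2 - t2) * o2 * (1 - o2)

# Hidden-layer deltas, backpropagated through the OLD output weights.
d_h1 = (d_o1 * w5 + d_o2 * w7) * h1 * (1 - h1)
d_h2 = (d_o1 * w6 + d_o2 * w8) * h2 * (1 - h2)

# Gradient-descent updates for all eight weights.
w5, w6 = w5 - lr * d_o1 * h1, w6 - lr * d_o1 * h2
w7, w8 = w7 - lr * d_o2 * h1, w8 - lr * d_o2 * h2
w1, w2 = w1 - lr * d_h1 * i1, w2 - lr * d_h1 * i2
w3, w4 = w3 - lr * d_h2 * i1, w4 - lr * d_h2 * i2

h1, h2, o1, o2 = forward(w1, w2, w3, w4, w5, w6, w7, w8)
e_after = 0.5 * (t1 - o1) ** 2 + 0.5 * (t2 - o2) ** 2
```

Under these assumptions the error drops from about 0.2984 to about 0.2910 after one round, matching the figures quoted above. Note that the hidden deltas are computed before the output weights are overwritten; updating them too early is a classic backpropagation bug.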
73. Issues with traditional algorithms
• They are mostly local search methods and thus cannot guarantee global optimality
• Results often depend on the initial starting points
• Methods tend to be problem-specific
• They cannot deal with highly nonlinear optimization problems efficiently
• They struggle to cope with high-dimensional problems
74. Local minima & global minima
• Global versus local minima: ANN training does not guarantee finding a global minimum.
• Overfitting problem: good training performance but poor predictive performance on unseen data, due to noise contained in the data; a lack of generalization.
• In order to avoid overfitting, it is necessary to use techniques such as cross-validation or early stopping.
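Early stopping, mentioned above, can be sketched as a generic training loop in Python. Here `train_step` and `val_error` are hypothetical caller-supplied hooks (one epoch of weight updates, and the current error on held-out data), and `patience` is how many epochs to wait without improvement before stopping:

```python
def train_with_early_stopping(train_step, val_error, max_epochs=500, patience=10):
    """Stop training once validation error fails to improve for `patience` epochs."""
    best_err, best_epoch = float("inf"), 0
    for epoch in range(max_epochs):
        train_step()
        err = val_error()
        if err < best_err:
            best_err, best_epoch = err, epoch   # validation still improving
        elif epoch - best_epoch >= patience:
            break                               # generalization has stopped improving
    return best_err

# Simulated validation curve: improves, then overfitting sets in.
curve = iter([5, 4, 3, 2, 1, 1.5, 2, 2.5, 3, 3.5, 4, 4.5, 5])
best = train_with_early_stopping(lambda: None, lambda: next(curve),
                                 max_epochs=12, patience=3)
```

The loop stops a few epochs after the validation minimum rather than training to `max_epochs`, which is exactly the generalization safeguard the slide describes.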