Neural Networks: MATLAB examples
Neural Networks course (practical examples) © 2012 Primoz Potocnik
Primoz Potocnik
University of Ljubljana
Faculty of Mechanical Engineering
LASIN - Laboratory of Synergetics
www.neural.si | primoz.potocnik@fs.uni-lj.si
Contents
1. nn02_neuron_output - Calculate the output of a simple neuron
2. nn02_custom_nn - Create and view custom neural networks
3. nn03_perceptron - Classification of linearly separable data with a perceptron
4. nn03_perceptron_network - Classification of a 4-class problem with a 2-neuron perceptron
5. nn03_adaline - ADALINE time series prediction with adaptive linear filter
6. nn04_mlp_xor - Classification of an XOR problem with a multilayer perceptron
7. nn04_mlp_4classes - Classification of a 4-class problem with a multilayer perceptron
8. nn04_technical_diagnostic - Industrial diagnostic of compressor connection rod defects [data2.zip]
9. nn05_narnet - Prediction of chaotic time series with NAR neural network
10. nn06_rbfn_func - Radial basis function networks for function approximation
11. nn06_rbfn_xor - Radial basis function networks for classification of XOR problem
12. nn07_som - 1D and 2D Self-Organizing Map
13. nn08_tech_diag_pca - PCA for industrial diagnostic of compressor connection rod defects [data2.zip]
Neuron output
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: Calculate the output of a simple neuron
Contents
- Define neuron parameters
- Define input vector
- Calculate neuron output
- Plot neuron output over the range of inputs
Define neuron parameters
close all, clear all, clc, format compact
% Neuron weights
w = [4 -2]
% Neuron bias
b = -3
% Activation function
func = 'tansig'
% func = 'purelin'
% func = 'hardlim'
% func = 'logsig'
w =
4 -2
b =
-3
func =
tansig
Define input vector
p = [2 3]
p =
2 3
Calculate neuron output
activation_potential = p*w'+b
neuron_output = feval(func, activation_potential)
activation_potential =
-1
neuron_output =
-0.7616
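As a quick check, tansig is mathematically the hyperbolic tangent, tansig(n) = 2/(1+exp(-2n))-1 = tanh(n), so the same value can be computed directly (a minimal sketch; 'manual_output' is an illustrative name):
% verify the toolbox result with the built-in hyperbolic tangent
manual_output = tanh(activation_potential) % matches neuron_output = -0.7616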
Plot neuron output over the range of inputs
[p1,p2] = meshgrid(-10:.25:10);
z = feval(func, [p1(:) p2(:)]*w'+b );
z = reshape(z,length(p1),length(p2));
plot3(p1,p2,z)
grid on
xlabel('Input 1')
ylabel('Input 2')
zlabel('Neuron output')
Published with MATLAB® 7.14
Custom networks
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: Create and view custom neural networks
Contents
- Define one sample: inputs and outputs
- Define custom network
- Define topology and transfer function
- Configure network
- Train net and calculate neuron output
Define one sample: inputs and outputs
close all, clear all, clc, format compact
inputs = [1:6]' % input vector (6-dimensional pattern)
outputs = [1 2]' % corresponding target output vector
inputs =
1
2
3
4
5
6
outputs =
1
2
Define custom network
% create network
net = network( ...
1, ... % numInputs, number of inputs,
2, ... % numLayers, number of layers
[1; 0], ... % biasConnect, numLayers-by-1 Boolean vector,
[1; 0], ... % inputConnect, numLayers-by-numInputs Boolean matrix,
[0 0; 1 0], ... % layerConnect, numLayers-by-numLayers Boolean matrix
[0 1] ... % outputConnect, 1-by-numLayers Boolean vector
);
% View network structure
view(net);
Define topology and transfer function
% number of hidden layer neurons
net.layers{1}.size = 5;
% hidden layer transfer function
net.layers{1}.transferFcn = 'logsig';
view(net);
Configure network
net = configure(net,inputs,outputs);
view(net);
Train net and calculate neuron output
% initial network response without training
initial_output = net(inputs)
% network training
net.trainFcn = 'trainlm';
net.performFcn = 'mse';
net = train(net,inputs,outputs);
% network response after training
final_output = net(inputs)
initial_output =
0
0
final_output =
1.0000
2.0000
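Because this custom net has one logsig hidden layer and a purelin output layer without a bias, its response can be reproduced by hand (a sketch assuming no input/output processing functions are attached, which holds for a raw network object):
% manual forward pass through the trained custom network
h = logsig(net.IW{1}*inputs + net.b{1}); % hidden layer: 5 logsig neurons with bias
manual_output = net.LW{2,1}*h % linear output layer; should match final_output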
Published with MATLAB® 7.14
Classification of linearly separable data with a perceptron
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: Two clusters of data, belonging to two classes, are defined in a 2-dimensional input space. Classes are
linearly separable. The task is to construct a Perceptron for the classification of data.
Contents
- Define input and output data
- Create and train perceptron
- Plot decision boundary
Define input and output data
close all, clear all, clc, format compact
% number of samples of each class
N = 20;
% define inputs and outputs
offset = 5; % offset for second class
x = [randn(2,N) randn(2,N)+offset]; % inputs
y = [zeros(1,N) ones(1,N)]; % outputs
% Plot input samples with PLOTPV (Plot perceptron input/target vectors)
figure(1)
plotpv(x,y);
Create and train perceptron
net = perceptron;
net = train(net,x,y);
view(net);
Plot decision boundary
figure(1)
plotpc(net.IW{1},net.b{1});
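The trained perceptron classifies a point by thresholding a linear combination of its coordinates with the hard-limit function (an illustrative sketch; the query point is arbitrary):
% manual perceptron decision for a query point
p = [1; 7]; % arbitrary test point
class = hardlim(net.IW{1}*p + net.b{1}) % 0 or 1, depending on the side of the boundary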
Published with MATLAB® 7.14
Classification of a 4-class problem with a perceptron
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: A perceptron network with 2 inputs and 2 outputs is trained to classify input vectors into 4 categories.
Contents
- Define data
- Prepare inputs & outputs for perceptron training
- Create a perceptron
- Train a perceptron
- How to use trained perceptron
Define data
close all, clear all, clc, format compact
% number of samples of each class
K = 30;
% define classes
q = .6; % offset of classes
A = [rand(1,K)-q; rand(1,K)+q];
B = [rand(1,K)+q; rand(1,K)+q];
C = [rand(1,K)+q; rand(1,K)-q];
D = [rand(1,K)-q; rand(1,K)-q];
% plot classes
plot(A(1,:),A(2,:),'bs')
hold on
grid on
plot(B(1,:),B(2,:),'r+')
plot(C(1,:),C(2,:),'go')
plot(D(1,:),D(2,:),'m*')
% text labels for classes
text(.5-q,.5+2*q,'Class A')
text(.5+q,.5+2*q,'Class B')
text(.5+q,.5-2*q,'Class C')
text(.5-q,.5-2*q,'Class D')
% define output coding for classes
a = [0 1]';
b = [1 1]';
c = [1 0]';
d = [0 0]';
% % Why this coding doesn't work?
% a = [0 0]';
% b = [1 1]';
% d = [0 1]';
% c = [1 0]';
% % Why this coding doesn't work?
% a = [0 1]';
% b = [1 1]';
% d = [1 0]';
% c = [0 1]';
Prepare inputs & outputs for perceptron training
% define inputs (combine samples from all four classes)
P = [A B C D];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B)) ...
repmat(c,1,length(C)) repmat(d,1,length(D)) ];
%plotpv(P,T);
Create a perceptron
net = perceptron;
Train a perceptron
ADAPT returns a new network object that performs better as a classifier, along with the network output and the error. The loop below lets the network adapt one pass at a time, replots the classification line, and stops when the error reaches zero (or after 1000 iterations).
E = 1;
net.adaptParam.passes = 1;
linehandle = plotpc(net.IW{1},net.b{1});
n = 0;
while (sse(E) && n<1000)
n = n+1;
[net,Y,E] = adapt(net,P,T);
linehandle = plotpc(net.IW{1},net.b{1},linehandle);
drawnow;
end
% show perceptron structure
view(net);
How to use trained perceptron
% For example, classify an input vector of [0.7; 1.2]
p = [0.7; 1.2]
y = net(p)
% compare response with output coding (a,b,c,d)
p =
0.7000
1.2000
y =
1
1
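The 2-bit response can be decoded back into a class name by matching it against the coding vectors (a small illustrative addition; 'codes' and 'labels' are assumed names):
% decode the perceptron output (here y = [1;1]) into a class name
codes = [a b c d]; % columns are the class codes
labels = 'abcd';
[~,idx] = ismember(y',codes','rows'); % find which code column matches y
predicted_class = labels(idx) % 'b' for y = [1;1]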
Published with MATLAB® 7.14
ADALINE time series prediction
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: Construct an ADALINE for adaptive prediction of time series based on past time series data
Contents
- Define input and output data
- Prepare data for neural network toolbox
- Define ADALINE neural network
- Adaptive learning of the ADALINE
- Plot results
Define input and output data
close all, clear all, clc, format compact
% define segments of time vector
dt = 0.01; % time step [seconds]
t1 = 0 : dt : 3; % first time vector [seconds]
t2 = 3+dt : dt : 6; % second time vector [seconds]
t = [t1 t2]; % complete time vector [seconds]
% define signal
y = [sin(4.1*pi*t1) .8*sin(8.3*pi*t2)];
% plot signal
plot(t,y,'.-')
xlabel('Time [sec]');
ylabel('Target Signal');
grid on
ylim([-1.2 1.2])
Prepare data for neural network toolbox
% There are two basic types of input vectors: those that occur concurrently
% (at the same time, or in no particular time sequence), and those that
% occur sequentially in time. For concurrent vectors, the order is not
% important, and if there were a number of networks running in parallel,
% you could present one input vector to each of the networks. For
% sequential vectors, the order in which the vectors appear is important.
p = con2seq(y);
Define ADALINE neural network
% The resulting network will predict the next value of the target signal
% using delayed values of the target.
inputDelays = 1:5; % delayed inputs to be used
learning_rate = 0.2; % learning rate
% define ADALINE
net = linearlayer(inputDelays,learning_rate);
Adaptive learning of the ADALINE
% Given an input sequence with N steps the network is updated as follows.
% Each step in the sequence of inputs is presented to the network one at
% a time. The network's weight and bias values are updated after each step,
% before the next step in the sequence is presented. Thus the network is
% updated N times. The output signal and the error signal are returned,
% along with the new network.
[net,Y,E] = adapt(net,p,p);
% view network structure
view(net)
% check final network parameters
disp('Weights and bias of the ADALINE after adaptation')
net.IW{1}
net.b{1}
Weights and bias of the ADALINE after adaptation
ans =
0.7179 0.4229 0.1552 -0.1203 -0.4159
ans =
-1.2520e-08
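For intuition, adapt on a linear layer applies the Widrow-Hoff (LMS) rule once per time step; a hand-written equivalent might look as follows (a sketch under that assumption; variable names are illustrative, and the toolbox differs in details such as initialization):
% explicit LMS loop with 5 delayed inputs (conceptual equivalent of adapt)
w = zeros(1,5); bias = 0; lr = 0.2;
for k = 6:length(y)
    x = y(k-1:-1:k-5);        % five most recent past values
    e = y(k) - (w*x' + bias); % prediction error of the linear neuron
    w = w + lr*e*x;           % Widrow-Hoff weight update
    bias = bias + lr*e;       % bias update
end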
Plot results
% transform result vectors
Y = seq2con(Y); Y = Y{1};
E = seq2con(E); E = E{1};
% start a new figure
figure;
% first graph
subplot(211)
plot(t,y,'b', t,Y,'r--');
legend('Original','Prediction')
grid on
xlabel('Time [sec]');
ylabel('Target Signal');
ylim([-1.2 1.2])
% second graph
subplot(212)
plot(t,E,'g');
grid on
legend('Prediction error')
xlabel('Time [sec]');
ylabel('Error');
ylim([-1.2 1.2])
Published with MATLAB® 7.14
Solving XOR problem with a multilayer perceptron
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: 4 clusters of data (A,B,C,D) are defined in a 2-dimensional input space. The (A,C) and (B,D) clusters represent the XOR classification problem. The task is to define a neural network for solving the XOR problem.
Contents
- Define 4 clusters of input data
- Define output coding for XOR problem
- Prepare inputs & outputs for network training
- Create and train a multilayer perceptron
- plot targets and network response to see how well the network learns the data
- Plot classification result for the complete input space
Define 4 clusters of input data
close all, clear all, clc, format compact
% number of samples of each class
K = 100;
% define 4 clusters of input data
q = .6; % offset of classes
A = [rand(1,K)-q; rand(1,K)+q];
B = [rand(1,K)+q; rand(1,K)+q];
C = [rand(1,K)+q; rand(1,K)-q];
D = [rand(1,K)-q; rand(1,K)-q];
% plot clusters
figure(1)
plot(A(1,:),A(2,:),'k+')
hold on
grid on
plot(B(1,:),B(2,:),'bd')
plot(C(1,:),C(2,:),'k+')
plot(D(1,:),D(2,:),'bd')
% text labels for clusters
text(.5-q,.5+2*q,'Class A')
text(.5+q,.5+2*q,'Class B')
text(.5+q,.5-2*q,'Class A')
text(.5-q,.5-2*q,'Class B')
Define output coding for XOR problem
% encode clusters a and c as one class, and b and d as another class
a = -1; % a | b
c = -1; % -------
b = 1; % d | c
d = 1; %
Prepare inputs & outputs for network training
% define inputs (combine samples from all four classes)
P = [A B C D];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B)) ...
repmat(c,1,length(C)) repmat(d,1,length(D)) ];
% view inputs |outputs
%[P' T']
Create and train a multilayer perceptron
% create a neural network
net = feedforwardnet([5 3]);
% train net
net.divideParam.trainRatio = 1; % training set [%]
net.divideParam.valRatio = 0; % validation set [%]
net.divideParam.testRatio = 0; % test set [%]
% train a neural network
[net,tr,Y,E] = train(net,P,T);
% show network
view(net)
plot targets and network response to see how well the network learns the data
figure(2)
plot(T','linewidth',2)
hold on
plot(Y','r--')
grid on
legend('Targets','Network response','location','best')
ylim([-1.25 1.25])
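Since the targets are coded as -1/+1, the fraction of correctly learned training samples can be read off by comparing signs (a small illustrative check):
% percentage of training samples with a correctly signed network response
accuracy = 100*mean(sign(Y) == T)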
Plot classification result for the complete input space
% generate a grid
span = -1:.005:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = net(pp);
% translate output into [-1,1]
%aa = -1 + 2*(aa>0);
% plot classification regions
figure(1)
mesh(P1,P2,reshape(aa,length(span),length(span))-5);
colormap cool
view(2)
Published with MATLAB® 7.14
Classification of a 4-class problem with a multilayer perceptron
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: 4 clusters of data (A,B,C,D) are defined in a 2-dimensional input space. The task is to define a neural network for classification of arbitrary point in
the 2-dimensional space into one of the classes (A,B,C,D).
Contents
- Define 4 clusters of input data
- Define output coding for all 4 clusters
- Prepare inputs & outputs for network training
- Create and train a multilayer perceptron
- Evaluate network performance and plot results
- Plot classification result for the complete input space
Define 4 clusters of input data
close all, clear all, clc, format compact
% number of samples of each class
K = 100;
% define 4 clusters of input data
q = .6; % offset of classes
A = [rand(1,K)-q; rand(1,K)+q];
B = [rand(1,K)+q; rand(1,K)+q];
C = [rand(1,K)+q; rand(1,K)-q];
D = [rand(1,K)-q; rand(1,K)-q];
% plot clusters
figure(1)
plot(A(1,:),A(2,:),'k+')
hold on
grid on
plot(B(1,:),B(2,:),'b*')
plot(C(1,:),C(2,:),'kx')
plot(D(1,:),D(2,:),'bd')
% text labels for clusters
text(.5-q,.5+2*q,'Class A')
text(.5+q,.5+2*q,'Class B')
text(.5+q,.5-2*q,'Class C')
text(.5-q,.5-2*q,'Class D')
Define output coding for all 4 clusters
% coding (+1/-1) of 4 separate classes
a = [-1 -1 -1 +1]';
b = [-1 -1 +1 -1]';
d = [-1 +1 -1 -1]';
c = [+1 -1 -1 -1]';
Prepare inputs & outputs for network training
% define inputs (combine samples from all four classes)
P = [A B C D];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B)) ...
repmat(c,1,length(C)) repmat(d,1,length(D)) ];
Create and train a multilayer perceptron
% create a neural network
net = feedforwardnet([4 3]);
% train net
net.divideParam.trainRatio = 1; % training set [%]
net.divideParam.valRatio = 0; % validation set [%]
net.divideParam.testRatio = 0; % test set [%]
% train a neural network
[net,tr,Y,E] = train(net,P,T);
% show network
view(net)
Evaluate network performance and plot results
% evaluate performance: decoding network response
[m,i] = max(T); % target class
[m,j] = max(Y); % predicted class
N = length(Y); % number of all samples
k = 0; % number of misclassified samples
if find(i-j), % if there exist misclassified samples
k = length(find(i-j)); % get the number of misclassified samples
end
fprintf('Correctly classified samples: %.1f%%\n', 100*(N-k)/N)
% plot network output
figure;
subplot(211)
plot(T')
title('Targets')
ylim([-2 2])
grid on
subplot(212)
plot(Y')
title('Network response')
xlabel('# sample')
ylim([-2 2])
grid on
Correctly classified samples: 100.0%
Plot classification result for the complete input space
% generate a grid
span = -1:.01:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = net(pp);
% plot classification regions based on MAX activation
figure(1)
m = mesh(P1,P2,reshape(aa(1,:),length(span),length(span))-5);
set(m,'facecolor',[1 0.2 .7],'linestyle','none');
hold on
m = mesh(P1,P2,reshape(aa(2,:),length(span),length(span))-5);
set(m,'facecolor',[1 1.0 0.5],'linestyle','none');
m = mesh(P1,P2,reshape(aa(3,:),length(span),length(span))-5);
set(m,'facecolor',[.4 1.0 0.9],'linestyle','none');
m = mesh(P1,P2,reshape(aa(4,:),length(span),length(span))-5);
set(m,'facecolor',[.3 .4 0.5],'linestyle','none');
view(2)
Published with MATLAB® 7.14
Industrial diagnostic of compressor connection rod defects
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: Industrial production of compressors suffers from problems during the imprinting operation, where a connection rod is joined to a compressor head. Irregular imprinting can damage or crack the connection rod, which results in a damaged compressor. Such compressors should be eliminated from the production line, but defects of this type are difficult to detect. The task is to detect crack and overload defects from the measurement of the imprinting force.
Contents
- Photos of the broken connection rod
- Load and plot data
- Prepare inputs: Data resampling
- Define binary output coding: 0=OK, 1=Error
- Create and train a multilayer perceptron
- Evaluate network performance
- Application
Photos of the broken connection rod
Load and plot data
close all, clear all, clc, format compact
% industrial data
load data2.mat
whos
% show data for class 1: OK
figure
plot(force','c')
grid on, hold on
plot(force(find(target==1),:)','b')
xlabel('Time')
ylabel('Force')
title(notes{1})
% show data for class 2: Overload
figure
plot(force','c')
grid on, hold on
plot(force(find(target==2),:)','r')
xlabel('Time')
ylabel('Force')
title(notes{2})
% show data for class 3: Crack
figure
plot(force','c')
grid on, hold on
plot(force(find(target==3),:)','m')
xlabel('Time')
ylabel('Force')
title(notes{3})
Name Size Bytes Class Attributes
force 2000x100 1600000 double
notes 1x3 222 cell
target 2000x1 16000 double
Prepare inputs: Data resampling
% include only every step-th data
step = 10;
force = force(:,1:step:size(force,2));
whos
% show resampled data for class 1: OK
figure
plot(force','c')
grid on, hold on
plot(force(find(target==1),:)','b')
xlabel('Time')
ylabel('Force')
title([notes{1} ' (resampled data)'])
% show resampled data for class 2: Overload
figure
plot(force','c')
grid on, hold on
plot(force(find(target==2),:)','r')
xlabel('Time')
ylabel('Force')
title([notes{2} ' (resampled data)'])
% show resampled data for class 3: Crack
figure
plot(force','c')
grid on, hold on
plot(force(find(target==3),:)','m')
xlabel('Time')
ylabel('Force')
title([notes{3} ' (resampled data)'])
Name Size Bytes Class Attributes
force 2000x10 160000 double
notes 1x3 222 cell
step 1x1 8 double
target 2000x1 16000 double
Define binary output coding: 0=OK, 1=Error
% binary coding 0/1
target = double(target > 1);
Create and train a multilayer perceptron
% create a neural network
net = feedforwardnet([4]);
% set early stopping parameters
net.divideParam.trainRatio = 0.70; % training set [%]
net.divideParam.valRatio = 0.15; % validation set [%]
net.divideParam.testRatio = 0.15; % test set [%]
% train a neural network
[net,tr,Y,E] = train(net,force',target');
Evaluate network performance
% digitize network response
threshold = 0.5;
Y = double(Y > threshold)';
% find percentage of correct classifications
correct_classifications = 100*length(find(Y==target))/length(target)
correct_classifications =
99.7500
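The single accuracy number can be broken down into a confusion matrix to see which error type dominates (an illustrative addition; 'cm' is an assumed name):
% confusion counts: rows = actual class (0=OK, 1=Error), columns = predicted class
cm = [sum(Y==0 & target==0) sum(Y==1 & target==0);
      sum(Y==0 & target==1) sum(Y==1 & target==1)]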
Application
% get sample
random_index = randi(length(force))
sample = force(random_index,:);
% plot sample
figure
plot(force','c')
grid on, hold on
plot(sample,'b')
xlabel('Time')
ylabel('Force')
% predict quality
q = net(sample');
% digitize network response
q = double(q > threshold)'
% comment on network response
if q==target(random_index)
title(sprintf('Quality: %d (correct network response)',q))
else
title(sprintf('Quality: %d (wrong network response)',q))
end
random_index =
1881
q =
0
Published with MATLAB® 7.14
Prediction of chaotic time series with NAR neural network
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: Design a neural network for the recursive prediction of the chaotic Mackey-Glass time series; try various network architectures and experiment with various delays.
Contents
q Generate data (Mackay-Glass time series)
q Define nonlinear autoregressive neural network
q Prepare input and target time series data for network training
q Train net
q Transform network into a closed-loop NAR network
q Recursive prediction on validation data
Generate data (Mackey-Glass time series)
close all, clear all, clc, format compact
% data settings
N = 700; % number of samples
Nu = 300; % number of learning samples
% Mackey-Glass time series
b = 0.1;
c = 0.2;
tau = 17;
% initialization
y = [0.9697 0.9699 0.9794 1.0003 1.0319 1.0703 1.1076 1.1352 1.1485 ...
1.1482 1.1383 1.1234 1.1072 1.0928 1.0820 1.0756 1.0739 1.0759]';
% generate Mackey-Glass time series
for n=18:N+99
y(n+1) = y(n) - b*y(n) + c*y(n-tau)/(1+y(n-tau).^10);
end
% remove initial values
y(1:100) = [];
% plot training and validation data
plot(y,'m-')
grid on, hold on
plot(y(1:Nu),'b')
plot(y,'+k','markersize',2)
legend('validation data','training data','sampling markers','location','southwest')
xlabel('time (steps)')
ylabel('y')
ylim([-.5 1.5])
set(gcf,'position',[1 60 800 400])
% prepare training data
yt = con2seq(y(1:Nu)');
% prepare validation data
yv = con2seq(y(Nu+1:end)');
Define nonlinear autoregressive neural network
%---------- network parameters -------------
% good parameters (you don't know 'tau' for an unknown process)
inputDelays = 1:6:19; % input delay vector
hiddenSizes = [6 3]; % network structure (number of neurons)
%-------------------------------------
% nonlinear autoregressive neural network
net = narnet(inputDelays, hiddenSizes);
Prepare input and target time series data for network training
% [Xs,Xi,Ai,Ts,EWs,shift] = preparets(net,Xnf,Tnf,Tf,EW)
%
% This function simplifies the normally complex and error prone task of
% reformatting input and target timeseries. It automatically shifts input
% and target time series as many steps as are needed to fill the initial
% input and layer delay states. If the network has open loop feedback,
% then it copies feedback targets into the inputs as needed to define the
% open loop inputs.
%
% net : Neural network
% Xnf : Non-feedback inputs
% Tnf : Non-feedback targets
% Tf : Feedback targets
% EW : Error weights (default = {1})
%
% Xs : Shifted inputs
% Xi : Initial input delay states
% Ai : Initial layer delay states
% Ts : Shifted targets
[Xs,Xi,Ai,Ts] = preparets(net,{},{},yt);
Train net
% train net with prepared training data
net = train(net,Xs,Ts,Xi,Ai);
% view trained net
view(net)
Transform network into a closed-loop NAR network
% close feedback for recursive prediction
net = closeloop(net);
% view closeloop version of a net
view(net);
Recursive prediction on validation data
% prepare validation data for network simulation
yini = yt(end-max(inputDelays)+1:end); % initial values from training data
% combine initial values and validation data 'yv'
[Xs,Xi,Ai] = preparets(net,{},{},[yini yv]);
% predict on validation data
predict = net(Xs,Xi,Ai);
% validation data
Yv = cell2mat(yv);
% prediction
Yp = cell2mat(predict);
% error
e = Yv - Yp;
% plot results of recursive simulation
figure(1)
plot(Nu+1:N,Yp,'r')
plot(Nu+1:N,e,'g')
legend('validation data','training data','sampling markers',...
'prediction','error','location','southwest')
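A single summary number makes it easier to compare architectures and delay vectors (a small illustrative addition):
% root-mean-square error of the recursive prediction on validation data
rmse = sqrt(mean(e.^2))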
Published with MATLAB® 7.14
Function approximation with RBFN
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: Create a function approximation model based on a measured data set. Apply various Neural Network architectures based on Radial Basis
Functions. Compare with Multilayer perceptron and Linear regression models.
Contents
- Linear Regression
- Exact RBFN
- RBFN
- GRNN
- RBFN trained by Bayesian regularization
- MLP
- Data generator
Linear Regression
close all, clear all, clc, format compact
% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% no hidden layers
net = feedforwardnet([]);
% % one hidden layer with linear transfer functions
% net = feedforwardnet([10]);
% net.layers{1}.transferFcn = 'purelin';
% set early stopping parameters
net.divideParam.trainRatio = 1.0; % training set [%]
net.divideParam.valRatio = 0.0; % validation set [%]
net.divideParam.testRatio = 0.0; % test set [%]
% train a neural network
net.trainParam.epochs = 200;
net = train(net,Xtrain,Ytrain);
%---------------------------------
% view net
view (net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'color',[1 .4 0])
legend('original function','available data','Linear regression','location','northwest')
Exact RBFN
% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% choose a spread constant
spread = .4;
% create a neural network
net = newrbe(Xtrain,Ytrain,spread);
%---------------------------------
% view net
view (net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'r')
legend('original function','available data','Exact RBFN','location','northwest')
Warning: Rank deficient, rank = 53, tol = 1.110223e-13.
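The rank-deficiency warning is expected here: the exact RBFN places one radbas neuron on every training point, so the interpolation system is nearly singular. Each hidden unit responds to the distance between the input and its center with a Gaussian-shaped function; in one dimension, roughly (a sketch of the idea, using the commonly documented 0.8326/spread scaling, so the response falls to 0.5 at a distance of one spread):
% radbas response of one hidden unit with center c and spread s
rbf = @(x,c,s) exp(-(0.8326*abs(x-c)/s).^2); % equals 0.5 at |x-c| = s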
RBFN
% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% choose a spread constant
spread = .2;
% choose max number of neurons
K = 40;
% performance goal (SSE)
goal = 0;
% number of neurons to add between displays
Ki = 5;
% create a neural network
net = newrb(Xtrain,Ytrain,goal,spread,K,Ki);
%---------------------------------
% view net
view (net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'r')
legend('original function','available data','RBFN','location','northwest')
NEWRB, neurons = 0, MSE = 333.938
NEWRB, neurons = 5, MSE = 47.271
NEWRB, neurons = 10, MSE = 12.3371
NEWRB, neurons = 15, MSE = 9.26908
NEWRB, neurons = 20, MSE = 4.16992
NEWRB, neurons = 25, MSE = 2.82444
NEWRB, neurons = 30, MSE = 2.43353
NEWRB, neurons = 35, MSE = 2.06149
NEWRB, neurons = 40, MSE = 1.94627
GRNN
% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% choose a spread constant
spread = .12;
% create a neural network
net = newgrnn(Xtrain,Ytrain,spread);
%---------------------------------
% view net
view (net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'r')
legend('original function','available data','GRNN','location','northwest')
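A GRNN is a normalized RBFN: the output is a Gaussian-weighted average of the training targets, so it always stays within the range of observed targets (a conceptual sketch, up to the toolbox's exact spread scaling; names are illustrative):
% conceptual GRNN prediction at a scalar query x
grnn_fun = @(x,Xc,Yc,s) sum(Yc.*exp(-((x-Xc)/s).^2)) / sum(exp(-((x-Xc)/s).^2));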
RBFN trained by Bayesian regularization
% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%--------- RBFN ------------------
% choose a spread constant
spread = .2;
% choose max number of neurons
K = 20;
% performance goal (SSE)
goal = 0;
% number of neurons to add between displays
Ki = 20;
% create a neural network
net = newrb(Xtrain,Ytrain,goal,spread,K,Ki);
%---------------------------------
% view net
view (net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'r')
% Show RBFN centers
c = net.iw{1};
plot(c,zeros(size(c)),'rs')
legend('original function','available data','RBFN','centers','location','northwest')
%--------- trainbr ---------------
% Retrain a RBFN using Bayesian regularization backpropagation
net.trainFcn='trainbr';
net.trainParam.epochs = 100;
% perform Levenberg-Marquardt training with Bayesian regularization
net = train(net,Xtrain,Ytrain);
%---------------------------------
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'m')
% Show RBFN centers
c = net.iw{1};
plot(c,ones(size(c)),'ms')
legend('original function','available data','RBFN','centers','RBFN + trainbr','new centers','location','northwest')
NEWRB, neurons = 0, MSE = 334.852
NEWRB, neurons = 20, MSE = 4.34189
MLP
% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% create a neural network
net = feedforwardnet([12 6]);
% set early stopping parameters
net.divideParam.trainRatio = 1.0; % training set [%]
net.divideParam.valRatio = 0.0; % validation set [%]
net.divideParam.testRatio = 0.0; % test set [%]
% train a neural network
net.trainParam.epochs = 200;
net = train(net,Xtrain,Ytrain);
%---------------------------------
% view net
view (net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'color',[1 .4 0])
legend('original function','available data','MLP','location','northwest')
Data generator
type data_generator
%% Data generator function
function [X,Xtrain,Ytrain,fig] = data_generator()
% data generator
X = 0.01:.01:10;
f = abs(besselj(2,X*7).*asind(X/2) + (X.^1.95)) + 2;
fig = figure;
plot(X,f,'b-')
hold on
grid on
% available data points
Ytrain = f + 5*(rand(1,length(f))-.5);
Xtrain = X([181:450 601:830]);
Ytrain = Ytrain([181:450 601:830]);
plot(Xtrain,Ytrain,'kx')
xlabel('x')
ylabel('y')
ylim([0 100])
legend('original function','available data','location','northwest')
Published with MATLAB® 7.14
Radial Basis Function Networks for Classification of XOR problem
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: 4 clusters of data (A,B,C,D) are defined in a 2-dimensional input space. The (A,C) and (B,D) clusters represent the XOR classification problem. The task is to define a neural network for solving the XOR problem.
Contents
1. Classification of XOR problem with an exact RBFN
2. Classification of XOR problem with a RBFN
3. Classification of XOR problem with a PNN
4. Classification of XOR problem with a GRNN
5. Bayesian regularization for RBFN
Classification of XOR problem with an exact RBFN
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: 2 groups of linearly inseparable data (A,B) are defined in a 2-dimensional input space. The task is to
define a neural network for solving the XOR classification problem.
Contents
- Create input data
- Define output coding
- Prepare inputs & outputs for network training
- Create an exact RBFN
- Evaluate network performance
- Plot classification result
- Plot RBFN centers
Create input data
close all, clear all, clc, format compact
% number of samples of each cluster
K = 100;
% offset of clusters
q = .6;
% define 2 groups of input data
A = [rand(1,K)-q rand(1,K)+q;
rand(1,K)+q rand(1,K)-q];
B = [rand(1,K)+q rand(1,K)-q;
rand(1,K)+q rand(1,K)-q];
% plot data
plot(A(1,:),A(2,:),'k+',B(1,:),B(2,:),'b*')
grid on
hold on
Define output coding
% coding (+1/-1) for 2-class XOR problem
a = -1;
b = 1;
Prepare inputs & outputs for network training
% define inputs (combine samples from all four classes)
P = [A B];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B))];
Create an exact RBFN
% choose a spread constant
spread = 1;
% create a neural network
net = newrbe(P,T,spread);
% view network
view(net)
Warning: Rank deficient, rank = 124, tol = 8.881784e-14.
Evaluate network performance
% simulate a network on training data
Y = net(P);
% calculate [%] of correct classifications
correct = 100 * length(find(T.*Y > 0)) / length(T);
fprintf('\nSpread = %.2f\n',spread)
fprintf('Num of neurons = %d\n',net.layers{1}.size)
fprintf('Correct class = %.2f %%\n',correct)
% plot targets and network response
figure;
plot(T')
hold on
grid on
plot(Y','r')
ylim([-2 2])
set(gca,'ytick',[-2 0 2])
legend('Targets','Network response')
xlabel('Sample No.')
Spread = 1.00
Num of neurons = 400
Correct class = 100.00 %
Plot classification result
% generate a grid
span = -1:.025:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = sim(net,pp);
% plot classification regions based on MAX activation
figure(1)
ma = mesh(P1,P2,reshape(-aa,length(span),length(span))-5);
mb = mesh(P1,P2,reshape( aa,length(span),length(span))-5);
set(ma,'facecolor',[1 0.2 .7],'linestyle','none');
set(mb,'facecolor',[1 1.0 .5],'linestyle','none');
view(2)
Plot RBFN centers
plot(net.iw{1}(:,1),net.iw{1}(:,2),'gs')
Published with MATLAB® 7.14
Classification of XOR problem with a RBFN
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: 2 groups of linearly inseparable data (A,B) are defined in a 2-dimensional input space. The task is to
define a neural network for solving the XOR classification problem.
Contents
- Create input data
- Define output coding
- Prepare inputs & outputs for network training
- Create a RBFN
- Evaluate network performance
- Plot classification result
- Plot RBFN centers
Create input data
close all, clear all, clc, format compact
% number of samples of each cluster
K = 100;
% offset of clusters
q = .6;
% define 2 groups of input data
A = [rand(1,K)-q rand(1,K)+q;
rand(1,K)+q rand(1,K)-q];
B = [rand(1,K)+q rand(1,K)-q;
rand(1,K)+q rand(1,K)-q];
% plot data
plot(A(1,:),A(2,:),'k+',B(1,:),B(2,:),'b*')
grid on
hold on
Define output coding
% coding (+1/-1) for 2-class XOR problem
a = -1;
b = 1;
Prepare inputs & outputs for network training
% define inputs (combine samples from all four classes)
P = [A B];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B))];
Create a RBFN
% NEWRB algorithm
% The following steps are repeated until the network's mean squared error
% falls below goal:
% 1. The network is simulated
% 2. The input vector with the greatest error is found
% 3. A radbas neuron is added with weights equal to that vector
% 4. The purelin layer weights are redesigned to minimize error
% choose a spread constant
spread = 2;
% choose max number of neurons
K = 20;
% performance goal (SSE)
goal = 0;
% number of neurons to add between displays
Ki = 4;
% create a neural network
net = newrb(P,T,goal,spread,K,Ki);
% view network
view(net)
NEWRB, neurons = 0, MSE = 1
NEWRB, neurons = 4, MSE = 0.302296
NEWRB, neurons = 8, MSE = 0.221059
NEWRB, neurons = 12, MSE = 0.193983
NEWRB, neurons = 16, MSE = 0.154859
NEWRB, neurons = 20, MSE = 0.122332
Evaluate network performance
% simulate RBFN on training data
Y = net(P);
% calculate [%] of correct classifications
correct = 100 * length(find(T.*Y > 0)) / length(T);
fprintf('\nSpread = %.2f\n',spread)
fprintf('Num of neurons = %d\n',net.layers{1}.size)
fprintf('Correct class = %.2f %%\n',correct)
% plot targets and network response
figure;
plot(T')
hold on
grid on
plot(Y','r')
ylim([-2 2])
set(gca,'ytick',[-2 0 2])
legend('Targets','Network response')
xlabel('Sample No.')
Spread = 2.00
Num of neurons = 20
Correct class = 99.50 %
Plot classification result
% generate a grid
span = -1:.025:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = sim(net,pp);
% plot classification regions based on MAX activation
figure(1)
ma = mesh(P1,P2,reshape(-aa,length(span),length(span))-5);
mb = mesh(P1,P2,reshape( aa,length(span),length(span))-5);
set(ma,'facecolor',[1 0.2 .7],'linestyle','none');
set(mb,'facecolor',[1 1.0 .5],'linestyle','none');
view(2)
Plot RBFN centers
plot(net.iw{1}(:,1),net.iw{1}(:,2),'gs')
Published with MATLAB® 7.14
Classification of XOR problem with a PNN
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: 2 groups of linearly inseparable data (A,B) are defined in a 2-dimensional input space. The task is to
define a neural network for solving the XOR classification problem.
Contents
- Create input data
- Define output coding
- Prepare inputs & outputs for network training
- Create a PNN
- Evaluate network performance
- Plot classification result for the complete input space
- plot PNN centers
Create input data
close all, clear all, clc, format compact
% number of samples of each cluster
K = 100;
% offset of clusters
q = .6;
% define 2 groups of input data
A = [rand(1,K)-q rand(1,K)+q;
rand(1,K)+q rand(1,K)-q];
B = [rand(1,K)+q rand(1,K)-q;
rand(1,K)+q rand(1,K)-q];
% plot data
plot(A(1,:),A(2,:),'k+',B(1,:),B(2,:),'b*')
grid on
hold on
Define output coding
% coding (+1/-1) for 2-class XOR problem
a = 1;
b = 2;
Prepare inputs & outputs for network training
% define inputs (combine samples from all four classes)
P = [A B];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B))];
Create a PNN
% choose a spread constant
spread = .5;
% create a neural network
net = newpnn(P,ind2vec(T),spread);
% view network
view(net)
Evaluate network performance
% simulate RBFN on training data
Y = net(P);
Y = vec2ind(Y);
% calculate [%] of correct classifications
correct = 100 * length(find(T==Y)) / length(T);
fprintf('\nSpread = %.2f\n',spread)
fprintf('Num of neurons = %d\n',net.layers{1}.size)
fprintf('Correct class = %.2f %%\n',correct)
% plot targets and network response
figure;
plot(T')
hold on
grid on
plot(Y','r--')
ylim([0 3])
set(gca,'ytick',[-2 0 2])
legend('Targets','Network response')
xlabel('Sample No.')
Spread = 0.50
Num of neurons = 400
Correct class = 100.00 %
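Conceptually, the PNN sums a Gaussian kernel over the training points of each class and selects the class with the largest sum (a sketch of the idea, up to the toolbox's exact spread scaling; names are illustrative):
% conceptual PNN decision for a query point x (classes coded 1 and 2)
x = [0; 0];
score = @(cls) sum(exp(-sum(bsxfun(@minus,P(:,T==cls),x).^2,1)/(2*spread^2)));
[~,predicted_class] = max([score(1) score(2)])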
Plot classification result for the complete input space
% generate a grid
span = -1:.025:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = sim(net,pp);
aa = vec2ind(aa)-1.5; % convert class indices 1/2 into -0.5/+0.5 for plotting
% plot classification regions based on MAX activation
figure(1)
ma = mesh(P1,P2,reshape(-aa,length(span),length(span))-5);
mb = mesh(P1,P2,reshape( aa,length(span),length(span))-5);
set(ma,'facecolor',[1 0.2 .7],'linestyle','none');
set(mb,'facecolor',[1 1.0 .5],'linestyle','none');
view(2)
plot PNN centers
plot(net.iw{1}(:,1),net.iw{1}(:,2),'gs')
Published with MATLAB® 7.14
Classification of XOR problem with a GRNN
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: 2 groups of linearly inseparable data (A,B) are defined in a 2-dimensional input space. The task is to
define a neural network for solving the XOR classification problem.
Contents
- Create input data
- Define output coding
- Prepare inputs & outputs for network training
- Create a GRNN
- Evaluate network performance
- Plot classification result
- plot GRNN centers
Create input data
close all, clear all, clc, format compact
% number of samples of each cluster
K = 100;
% offset of clusters
q = .6;
% define 2 groups of input data
A = [rand(1,K)-q rand(1,K)+q;
rand(1,K)+q rand(1,K)-q];
B = [rand(1,K)+q rand(1,K)-q;
rand(1,K)+q rand(1,K)-q];
% plot data
plot(A(1,:),A(2,:),'k+',B(1,:),B(2,:),'b*')
grid on
hold on
Define output coding
% coding (+1/-1) for 2-class XOR problem
a = -1;
b = 1;
Prepare inputs & outputs for network training
% define inputs (combine samples from all four classes)
P = [A B];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B))];
Create a GRNN
% choose a spread constant
spread = .2;
% create a neural network
net = newgrnn(P,T,spread);
% view network
view(net)
Evaluate network performance
% simulate GRNN on training data
Y = net(P);
% calculate [%] of correct classifications
correct = 100 * length(find(T.*Y > 0)) / length(T);
fprintf('\nSpread = %.2f\n',spread)
fprintf('Num of neurons = %d\n',net.layers{1}.size)
fprintf('Correct class = %.2f %%\n',correct)
% plot targets and network response
figure;
plot(T')
hold on
grid on
plot(Y','r')
ylim([-2 2])
set(gca,'ytick',[-2 0 2])
legend('Targets','Network response')
xlabel('Sample No.')
Spread = 0.20
Num of neurons = 400
Correct class = 100.00 %
Plot classification result
% generate a grid
span = -1:.025:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = sim(net,pp);
% plot classification regions based on MAX activation
figure(1)
ma = mesh(P1,P2,reshape(-aa,length(span),length(span))-5);
mb = mesh(P1,P2,reshape( aa,length(span),length(span))-5);
set(ma,'facecolor',[1 0.2 .7],'linestyle','none');
set(mb,'facecolor',[1 1.0 .5],'linestyle','none');
view(2)
plot GRNN centers
plot(net.iw{1}(:,1),net.iw{1}(:,2),'gs')
Published with MATLAB® 7.14
Bayesian regularization for RBFN
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: 2 groups of linearly inseparable data (A,B) are defined in a 2-dimensional input space. The task is to define a
neural network for solving the XOR classification problem.
Contents
- Create input data
- Define output coding
- Prepare inputs & outputs for network training
- Create a RBFN
- Evaluate network performance
- Plot classification result
- Retrain a RBFN using Bayesian regularization backpropagation
- Evaluate network performance after Bayesian regularization training
- Plot classification result after Bayesian regularization training
Create input data
close all, clear all, clc, format compact
% number of samples of each cluster
K = 100;
% offset of clusters
q = .6;
% define 2 groups of input data
A = [rand(1,K)-q rand(1,K)+q;
rand(1,K)+q rand(1,K)-q];
B = [rand(1,K)+q rand(1,K)-q;
rand(1,K)+q rand(1,K)-q];
% plot data
plot(A(1,:),A(2,:),'k+',B(1,:),B(2,:),'b*')
grid on
hold on
Define output coding
% coding (+1/-1) for 2-class XOR problem
a = -1;
b = 1;
Prepare inputs & outputs for network training
% define inputs (combine samples from all four classes)
P = [A B];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B))];
Create a RBFN
% choose a spread constant
spread = .1;
% choose max number of neurons
K = 10;
% performance goal (SSE)
goal = 0;
% number of neurons to add between displays
Ki = 2;
% create a neural network
net = newrb(P,T,goal,spread,K,Ki);
% view network
view(net)
NEWRB, neurons = 0, MSE = 1
NEWRB, neurons = 2, MSE = 0.928277
NEWRB, neurons = 4, MSE = 0.855829
NEWRB, neurons = 6, MSE = 0.798564
NEWRB, neurons = 8, MSE = 0.742854
NEWRB, neurons = 10, MSE = 0.690962
Evaluate network performance
% check RBFN spread (newrb stores the bias as approximately 0.8326/spread, so spread = 0.1 gives about 8.326)
actual_spread = net.b{1}
% simulate RBFN on training data
Y = net(P);
% calculate [%] of correct classifications
correct = 100 * length(find(T.*Y > 0)) / length(T);
fprintf('\nSpread = %.2f\n',spread)
fprintf('Num of neurons = %d\n',net.layers{1}.size)
fprintf('Correct class = %.2f %%\n',correct)
% plot targets and network response to see how well the network learns the data
figure;
plot(T')
ylim([-2 2])
set(gca,'ytick',[-2 0 2])
hold on
grid on
plot(Y','r')
legend('Targets','Network response')
xlabel('Sample No.')
actual_spread =
8.3255
8.3255
8.3255
8.3255
8.3255
8.3255
8.3255
8.3255
8.3255
8.3255
Spread = 0.10
Num of neurons = 10
Correct class = 79.50 %
Plot classification result
% generate a grid
span = -1:.025:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = sim(net,pp);
% plot classification regions based on MAX activation
figure(1)
ma = mesh(P1,P2,reshape(-aa,length(span),length(span))-5);
mb = mesh(P1,P2,reshape( aa,length(span),length(span))-5);
set(ma,'facecolor',[1 0.2 .7],'linestyle','none');
set(mb,'facecolor',[1 1.0 .5],'linestyle','none');
view(2)
% plot RBFN centers
plot(net.iw{1}(:,1),net.iw{1}(:,2),'gs')
Retrain a RBFN using Bayesian regularization backpropagation
% define custom training function: Bayesian regularization backpropagation
net.trainFcn='trainbr';
% perform Levenberg-Marquardt training with Bayesian regularization
net = train(net,P,T);
Evaluate network performance after Bayesian regularization training
% check new RBFN spread
spread_after_training = net.b{1}
% simulate RBFN on training data
Y = net(P);
% calculate [%] of correct classifications
correct = 100 * length(find(T.*Y > 0)) / length(T);
fprintf('Num of neurons = %d\n',net.layers{1}.size)
fprintf('Correct class = %.2f %%\n',correct)
% plot targets and network response
figure;
plot(T')
ylim([-2 2])
set(gca,'ytick',[-2 0 2])
hold on
grid on
plot(Y','r')
legend('Targets','Network response')
xlabel('Sample No.')
spread_after_training =
2.9924
3.0201
0.7809
0.5933
2.6968
2.8934
2.2121
2.9748
2.7584
3.5739
Num of neurons = 10
Correct class = 100.00 %
Plot classification result after Bayesian regularization training
% simulate neural network on a grid
aa = sim(net,pp);
% plot classification regions based on MAX activation
figure(1)
ma = mesh(P1,P2,reshape(-aa,length(span),length(span))-5);
mb = mesh(P1,P2,reshape( aa,length(span),length(span))-5);
set(ma,'facecolor',[1 0.2 .7],'linestyle','none');
set(mb,'facecolor',[1 1.0 .5],'linestyle','none');
view(2)
% Plot modified RBFN centers
plot(net.iw{1}(:,1),net.iw{1}(:,2),'rs','linewidth',2)
Published with MATLAB® 7.14
1D and 2D Self-Organizing Map
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: Define 1-dimensional and 2-dimensional SOM networks to represent the 2-dimensional input space.
Contents
- Define 4 clusters of input data
- Create and train 1D-SOM
- plot 1D-SOM results
- Create and train 2D-SOM
- plot 2D-SOM results
Define 4 clusters of input data
close all, clear all, clc, format compact
% number of samples of each cluster
K = 200;
% offset of classes
q = 1.1;
% define 4 clusters of input data
P = [rand(1,K)-q rand(1,K)+q rand(1,K)+q rand(1,K)-q;
rand(1,K)+q rand(1,K)+q rand(1,K)-q rand(1,K)-q];
% plot clusters
plot(P(1,:),P(2,:),'g.')
hold on
grid on
Create and train 1D-SOM
% SOM parameters
dimensions = [100];
coverSteps = 100;
initNeighbor = 10;
topologyFcn = 'gridtop';
distanceFcn = 'linkdist';
% define net
net1 = selforgmap(dimensions,coverSteps,initNeighbor,topologyFcn,distanceFcn);
% train
[net1,Y] = train(net1,P);
plot 1D-SOM results
% plot input data and SOM weight positions
plotsompos(net1,P);
grid on
Create and train 2D-SOM
% SOM parameters
dimensions = [10 10];
coverSteps = 100;
initNeighbor = 4;
topologyFcn = 'hextop';
distanceFcn = 'linkdist';
% define net
net2 = selforgmap(dimensions,coverSteps,initNeighbor,topologyFcn,distanceFcn);
% train
[net2,Y] = train(net2,P);
plot 2D-SOM results
% plot input data and SOM weight positions
plotsompos(net2,P);
grid on
% plot SOM neighbor distances
plotsomnd(net2)
% plot for each SOM neuron the number of input vectors that it classifies
figure
plotsomhits(net2,P)
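A trained SOM can also be queried directly: simulating the network returns a one-hot competitive vector whose index identifies the winning neuron (a small illustrative addition):
% find the winning (best-matching) neuron for a new input point
x = [0; 0];
winner = vec2ind(net2(x)) % index of the SOM neuron closest to x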
Published with MATLAB® 7.14
PCA for industrial diagnostic of compressor connection rod defects
Neural Networks course (practical examples) © 2012 Primoz Potocnik
PROBLEM DESCRIPTION: Industrial production of compressors suffers from problems during the imprinting operation, where a connection rod is joined to a compressor head. Irregular imprinting can damage or crack the connection rod, which results in a damaged compressor. Such compressors should be eliminated from the production line, but defects of this type are difficult to detect. The task is to detect crack and overload defects from the measurement of the imprinting force.
Contents
- Photos of the broken connection rod
- Load and plot data
- Prepare inputs by PCA
- Define output coding: 0=OK, 1=Error
- Create and train a multilayer perceptron
- Evaluate network performance
- Plot classification result
Photos of the broken connection rod
Load and plot data
close all, clear all, clc, format compact
% industrial data
load data2.mat
whos
% show data
figure
plot(force(find(target==1),:)','b') % OK (class 1)
grid on, hold on
plot(force(find(target>1),:)','r') % NOT OK (class 2 & 3)
xlabel('Time')
ylabel('Force')
Name Size Bytes Class Attributes
force 2000x100 1600000 double
notes 1x3 222 cell
target 2000x1 16000 double
Prepare inputs by PCA
% 1. Standardize inputs to zero mean, variance one
[pn,ps1] = mapstd(force');
% 2. Apply Principal Components Analysis
% inputs whose contribution to total variation is less than maxfrac are removed
FP.maxfrac = 0.1;
% process inputs with principal component analysis
[ptrans,ps2] = processpca(pn, FP);
ps2
% transformed inputs
force2 = ptrans';
whos force force2
% plot data in the space of first 2 PCA components
figure
plot(force2(:,1),force2(:,2),'.') % OK
grid on, hold on
plot(force2(find(target>1),1),force2(find(target>1),2),'r.') % NOT_OK
xlabel('pca1')
ylabel('pca2')
legend('OK','NOT OK','location','nw')
% % plot data in the space of first 3 PCA components
% figure
% plot3(force2(find(target==1),1),force2(find(target==1),2),force2(find(target==1),3),'b.')
% grid on, hold on
% plot3(force2(find(target>1),1),force2(find(target>1),2),force2(find(target>1),3),'r.')
ps2 =
name: 'processpca'
xrows: 100
maxfrac: 0.1000
yrows: 2
transform: [2x100 double]
no_change: 0
Name Size Bytes Class Attributes
force 2000x100 1600000 double
force2 2000x2 32000 double
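The same reduction can be written out with basic linear algebra (a sketch assuming standard eigenvector-based PCA of the standardized data; the toolbox implementation may differ in details such as component sign and ordering):
% manual PCA sketch on the standardized inputs pn (rows = variables)
C = pn*pn'/size(pn,2);                   % covariance matrix of the standardized data
[V,D] = eig(C);                          % eigenvectors and eigenvalues
[lambda,order] = sort(diag(D),'descend');
varfrac = lambda/sum(lambda);            % variance fraction per component
keep = varfrac >= 0.1;                   % same threshold as FP.maxfrac
force2_manual = (V(:,order(keep))'*pn)'; % project onto retained components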
Define output coding: 0=OK, 1=Error
% binary coding 0/1
target = double(target > 1);
Create and train a multilayer perceptron
% create a neural network
net = feedforwardnet([6 4]);
% set early stopping parameters
net.divideParam.trainRatio = 0.70; % training set [%]
net.divideParam.valRatio = 0.15; % validation set [%]
net.divideParam.testRatio = 0.15; % test set [%]
% train a neural network
[net,tr,Y,E] = train(net,force2',target');
% show net
view(net)
Evaluate network performance
% digitize network response
threshold = 0.5;
Y = double(Y > threshold)';
% find percentage of correct classifications
cc = 100*length(find(Y==target))/length(target);
fprintf('Correct classifications: %.1f [%%]\n', cc)
Correct classifications: 99.6 [%]
Plot classification result
figure(2)
a = axis;
% generate a grid, expand input space
xspan = a(1)-10 : .1 : a(2)+10;
yspan = a(3)-10 : .1 : a(4)+10;
[P1,P2] = meshgrid(xspan,yspan);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = sim(net,pp);
aa = double(aa > threshold);
% plot classification regions based on MAX activation
ma = mesh(P1,P2,reshape(-aa,length(yspan),length(xspan))-4);
mb = mesh(P1,P2,reshape( aa,length(yspan),length(xspan))-5);
set(ma,'facecolor',[.7 1.0 1],'linestyle','none');
set(mb,'facecolor',[1 0.7 1],'linestyle','none');
view(2)
Published with MATLAB® 7.14
 
Fixed-Point Code Synthesis for Neural Networks
Fixed-Point Code Synthesis for Neural NetworksFixed-Point Code Synthesis for Neural Networks
Fixed-Point Code Synthesis for Neural Networksgerogepatton
 
NeuralProcessingofGeneralPurposeApproximatePrograms
NeuralProcessingofGeneralPurposeApproximateProgramsNeuralProcessingofGeneralPurposeApproximatePrograms
NeuralProcessingofGeneralPurposeApproximateProgramsMohid Nabil
 
Towards neuralprocessingofgeneralpurposeapproximateprograms
Towards neuralprocessingofgeneralpurposeapproximateprogramsTowards neuralprocessingofgeneralpurposeapproximateprograms
Towards neuralprocessingofgeneralpurposeapproximateprogramsParidha Saxena
 
What is pattern recognition (lecture 6 of 6)
What is pattern recognition (lecture 6 of 6)What is pattern recognition (lecture 6 of 6)
What is pattern recognition (lecture 6 of 6)Randa Elanwar
 
# Neural network toolbox
# Neural network toolbox # Neural network toolbox
# Neural network toolbox VineetKumar508
 
Training course lect3
Training course lect3Training course lect3
Training course lect3Noor Dhiya
 
2.5 backpropagation
2.5 backpropagation2.5 backpropagation
2.5 backpropagationKrish_ver2
 
Digital signal Processing all matlab code with Lab report
Digital signal Processing all matlab code with Lab report Digital signal Processing all matlab code with Lab report
Digital signal Processing all matlab code with Lab report Alamgir Hossain
 
Nural network ER.Abhishek k. upadhyay
Nural network  ER.Abhishek k. upadhyayNural network  ER.Abhishek k. upadhyay
Nural network ER.Abhishek k. upadhyayabhishek upadhyay
 
SLIDING WINDOW SUM ALGORITHMS FOR DEEP NEURAL NETWORKS
SLIDING WINDOW SUM ALGORITHMS FOR DEEP NEURAL NETWORKSSLIDING WINDOW SUM ALGORITHMS FOR DEEP NEURAL NETWORKS
SLIDING WINDOW SUM ALGORITHMS FOR DEEP NEURAL NETWORKSIJCI JOURNAL
 
Pointcuts and Analysis
Pointcuts and AnalysisPointcuts and Analysis
Pointcuts and AnalysisWiwat Ruengmee
 
DSP_Lab_MAnual_-_Final_Edition[1].docx
DSP_Lab_MAnual_-_Final_Edition[1].docxDSP_Lab_MAnual_-_Final_Edition[1].docx
DSP_Lab_MAnual_-_Final_Edition[1].docxParthDoshi66
 
HW2-1_05.doc
HW2-1_05.docHW2-1_05.doc
HW2-1_05.docbutest
 

Similar to Nn examples (20)

Fixed-Point Code Synthesis for Neural Networks
Fixed-Point Code Synthesis for Neural NetworksFixed-Point Code Synthesis for Neural Networks
Fixed-Point Code Synthesis for Neural Networks
 
Fixed-Point Code Synthesis for Neural Networks
Fixed-Point Code Synthesis for Neural NetworksFixed-Point Code Synthesis for Neural Networks
Fixed-Point Code Synthesis for Neural Networks
 
Dsp lab manual
Dsp lab manualDsp lab manual
Dsp lab manual
 
NeuralProcessingofGeneralPurposeApproximatePrograms
NeuralProcessingofGeneralPurposeApproximateProgramsNeuralProcessingofGeneralPurposeApproximatePrograms
NeuralProcessingofGeneralPurposeApproximatePrograms
 
Towards neuralprocessingofgeneralpurposeapproximateprograms
Towards neuralprocessingofgeneralpurposeapproximateprogramsTowards neuralprocessingofgeneralpurposeapproximateprograms
Towards neuralprocessingofgeneralpurposeapproximateprograms
 
Ns2
Ns2Ns2
Ns2
 
What is pattern recognition (lecture 6 of 6)
What is pattern recognition (lecture 6 of 6)What is pattern recognition (lecture 6 of 6)
What is pattern recognition (lecture 6 of 6)
 
Neural networks
Neural networksNeural networks
Neural networks
 
Dsp file
Dsp fileDsp file
Dsp file
 
# Neural network toolbox
# Neural network toolbox # Neural network toolbox
# Neural network toolbox
 
signal and system
signal and system signal and system
signal and system
 
Training course lect3
Training course lect3Training course lect3
Training course lect3
 
2.5 backpropagation
2.5 backpropagation2.5 backpropagation
2.5 backpropagation
 
Function Approx2009
Function Approx2009Function Approx2009
Function Approx2009
 
Digital signal Processing all matlab code with Lab report
Digital signal Processing all matlab code with Lab report Digital signal Processing all matlab code with Lab report
Digital signal Processing all matlab code with Lab report
 
Nural network ER.Abhishek k. upadhyay
Nural network  ER.Abhishek k. upadhyayNural network  ER.Abhishek k. upadhyay
Nural network ER.Abhishek k. upadhyay
 
SLIDING WINDOW SUM ALGORITHMS FOR DEEP NEURAL NETWORKS
SLIDING WINDOW SUM ALGORITHMS FOR DEEP NEURAL NETWORKSSLIDING WINDOW SUM ALGORITHMS FOR DEEP NEURAL NETWORKS
SLIDING WINDOW SUM ALGORITHMS FOR DEEP NEURAL NETWORKS
 
Pointcuts and Analysis
Pointcuts and AnalysisPointcuts and Analysis
Pointcuts and Analysis
 
DSP_Lab_MAnual_-_Final_Edition[1].docx
DSP_Lab_MAnual_-_Final_Edition[1].docxDSP_Lab_MAnual_-_Final_Edition[1].docx
DSP_Lab_MAnual_-_Final_Edition[1].docx
 
HW2-1_05.doc
HW2-1_05.docHW2-1_05.doc
HW2-1_05.doc
 

Recently uploaded

Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)Suman Mia
 
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Dr.Costas Sachpazis
 
IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...
IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...
IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...RajaP95
 
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...ZTE
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...ranjana rawat
 
GDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentationGDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentationGDSCAESB
 
What are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptxWhat are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptxwendy cai
 
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...Soham Mondal
 
Call Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile serviceCall Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile servicerehmti665
 
Processing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxProcessing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxpranjaldaimarysona
 
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Dr.Costas Sachpazis
 
Biology for Computer Engineers Course Handout.pptx
Biology for Computer Engineers Course Handout.pptxBiology for Computer Engineers Course Handout.pptx
Biology for Computer Engineers Course Handout.pptxDeepakSakkari2
 
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVHARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVRajaP95
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
Coefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxCoefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxAsutosh Ranjan
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxupamatechverse
 
Introduction and different types of Ethernet.pptx
Introduction and different types of Ethernet.pptxIntroduction and different types of Ethernet.pptx
Introduction and different types of Ethernet.pptxupamatechverse
 

Recently uploaded (20)

Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
 
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
 
IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...
IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...
IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...
 
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
 
GDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentationGDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentation
 
What are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptxWhat are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptx
 
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
 
Call Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile serviceCall Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile service
 
Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptxExploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
 
Processing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxProcessing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptx
 
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
 
Biology for Computer Engineers Course Handout.pptx
Biology for Computer Engineers Course Handout.pptxBiology for Computer Engineers Course Handout.pptx
Biology for Computer Engineers Course Handout.pptx
 
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVHARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
 
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
 
Coefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxCoefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptx
 
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCRCall Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptx
 
Introduction and different types of Ethernet.pptx
Introduction and different types of Ethernet.pptxIntroduction and different types of Ethernet.pptx
Introduction and different types of Ethernet.pptx
 

Nn examples

  • 1. Neural Networks: MATLAB examples Neural Networks course (practical examples) © 2012 Primoz Potocnik Primoz Potocnik University of Ljubljana Faculty of Mechanical Engineering LASIN - Laboratory of Synergetics www.neural.si | primoz.potocnik@fs.uni-lj.si Contents 1. nn02_neuron_output - Calculate the output of a simple neuron 2. nn02_custom_nn - Create and view custom neural networks 3. nn03_perceptron - Classification of linearly separable data with a perceptron 4. nn03_perceptron_network - Classification of a 4-class problem with a 2-neuron perceptron 5. nn03_adaline - ADALINE time series prediction with adaptive linear filter 6. nn04_mlp_xor - Classification of an XOR problem with a multilayer perceptron 7. nn04_mlp_4classes - Classification of a 4-class problem with a multilayer perceptron 8. nn04_technical_diagnostic - Industrial diagnostic of compressor connection rod defects [data2.zip] 9. nn05_narnet - Prediction of chaotic time series with NAR neural network 10. nn06_rbfn_func - Radial basis function networks for function approximation 11. nn06_rbfn_xor - Radial basis function networks for classification of XOR problem 12. nn07_som - 1D and 2D Self Organized Map 13. nn08_tech_diag_pca - PCA for industrial diagnostic of compressor connection rod defects [data2.zip] Page 1 of 91
  • 2. Neuron output Neural Networks course (practical examples) © 2012 Primoz Potocnik PROBLEM DESCRIPTION: Calculate the output of a simple neuron Contents q Define neuron parameters q Define input vector q Calculate neuron output q Plot neuron output over the range of inputs Define neuron parameters close all, clear all, clc, format compact % Neuron weights w = [4 -2] % Neuron bias b = -3 % Activation function func = 'tansig' % func = 'purelin' % func = 'hardlim' % func = 'logsig' w = 4 -2 b = -3 func = tansig Define input vector p = [2 3] p = 2 3 Calculate neuron output activation_potential = p*w'+b Page 2 of 91
  • 3. neuron_output = feval(func, activation_potential) activation_potential = -1 neuron_output = -0.7616 Plot neuron output over the range of inputs [p1,p2] = meshgrid(-10:.25:10); z = feval(func, [p1(:) p2(:)]*w'+b ); z = reshape(z,length(p1),length(p2)); plot3(p1,p2,z) grid on xlabel('Input 1') ylabel('Input 2') zlabel('Neuron output') Published with MATLAB® 7.14 Page 3 of 91
  • 4. Custom networks Neural Networks course (practical examples) © 2012 Primoz Potocnik PROBLEM DESCRIPTION: Create and view custom neural networks Contents q Define one sample: inputs and outputs q Define and custom network q Define topology and transfer function q Configure network q Train net and calculate neuron output Define one sample: inputs and outputs close all, clear all, clc, format compact inputs = [1:6]' % input vector (6-dimensional pattern) outputs = [1 2]' % corresponding target output vector inputs = 1 2 3 4 5 6 outputs = 1 2 Define and custom network % create network net = network( ... 1, ... % numInputs, number of inputs, 2, ... % numLayers, number of layers [1; 0], ... % biasConnect, numLayers-by-1 Boolean vector, [1; 0], ... % inputConnect, numLayers-by-numInputs Boolean matrix, [0 0; 1 0], ... % layerConnect, numLayers-by-numLayers Boolean matrix [0 1] ... % outputConnect, 1-by-numLayers Boolean vector ); % View network structure view(net); Page 4 of 91
view(net);

Train net and calculate neuron output

Page 5 of 91
% initial network response without training
initial_output = net(inputs)
% network training
net.trainFcn = 'trainlm';
net.performFcn = 'mse';
net = train(net,inputs,outputs);
% network response after training
final_output = net(inputs)

initial_output =
     0
     0
final_output =
    1.0000
    2.0000

Published with MATLAB® 7.14

Page 6 of 91
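A quick sketch to round off this example (assuming the trained net from above; not part of the original notes): the configured custom network object is callable like any built-in toolbox network, so it can be simulated on a new, hypothetical input pattern.

p_new = [6 5 4 3 2 1]'; % hypothetical 6-dimensional test pattern
y_new = net(p_new)      % 2-dimensional network response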
Classification of linearly separable data with a perceptron

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: Two clusters of data, belonging to two classes, are defined in a 2-dimensional input space. Classes are linearly separable. The task is to construct a Perceptron for the classification of data.

Contents

q Define input and output data
q Create and train perceptron
q Plot decision boundary

Define input and output data

close all, clear all, clc, format compact
% number of samples of each class
N = 20;
% define inputs and outputs
offset = 5; % offset for second class
x = [randn(2,N) randn(2,N)+offset]; % inputs
y = [zeros(1,N) ones(1,N)]; % outputs
% Plot input samples with PLOTPV (Plot perceptron input/target vectors)
figure(1)
plotpv(x,y);

Page 7 of 91
Create and train perceptron

net = perceptron;
net = train(net,x,y);
view(net);

Plot decision boundary

figure(1)
plotpc(net.IW{1},net.b{1});

Page 8 of 91
Published with MATLAB® 7.14

Page 9 of 91
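A short usage sketch (assuming the trained net from above; not part of the original notes): new points can be classified directly, and the hardlim outputs follow the 0/1 class coding used for training.

p = [1 6; 0 6];  % two hypothetical test points, one per column
y = net(p)       % expected: 0 for the first cluster, 1 for the offset cluster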
Classification of a 4-class problem with a perceptron

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: A perceptron network with 2 inputs and 2 outputs is trained to classify input vectors into 4 categories.

Contents

q Define data
q Prepare inputs & outputs for perceptron training
q Create a perceptron
q Train a perceptron
q How to use trained perceptron

Define data

close all, clear all, clc, format compact
% number of samples of each class
K = 30;
% define classes
q = .6; % offset of classes
A = [rand(1,K)-q; rand(1,K)+q];
B = [rand(1,K)+q; rand(1,K)+q];
C = [rand(1,K)+q; rand(1,K)-q];
D = [rand(1,K)-q; rand(1,K)-q];
% plot classes
plot(A(1,:),A(2,:),'bs')
hold on
grid on
plot(B(1,:),B(2,:),'r+')
plot(C(1,:),C(2,:),'go')
plot(D(1,:),D(2,:),'m*')
% text labels for classes
text(.5-q,.5+2*q,'Class A')
text(.5+q,.5+2*q,'Class B')
text(.5+q,.5-2*q,'Class C')
text(.5-q,.5-2*q,'Class D')
% define output coding for classes
a = [0 1]';
b = [1 1]';
c = [1 0]';
d = [0 0]';
% % Why this coding doesn't work?
% a = [0 0]';
% b = [1 1]';
% d = [0 1]';

Page 10 of 91
% c = [1 0]';
% % Why this coding doesn't work?
% a = [0 1]';
% b = [1 1]';
% d = [1 0]';
% c = [0 1]';

Prepare inputs & outputs for perceptron training

% define inputs (combine samples from all four classes)
P = [A B C D];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B)) ...
     repmat(c,1,length(C)) repmat(d,1,length(D)) ];
%plotpv(P,T);

Create a perceptron

net = perceptron;

Train a perceptron

ADAPT returns a new network object that performs as a better classifier, the network output, and the error. This loop lets the network adapt one pass at a time, plots the classification line, and continues until the error is zero or a limit of 1000 passes is reached.

Page 11 of 91
E = 1;
net.adaptParam.passes = 1;
linehandle = plotpc(net.IW{1},net.b{1});
n = 0;
while (sse(E) & n<1000)
    n = n+1;
    [net,Y,E] = adapt(net,P,T);
    linehandle = plotpc(net.IW{1},net.b{1},linehandle);
    drawnow;
end
% show perceptron structure
view(net);

Page 12 of 91
How to use trained perceptron

% For example, classify an input vector of [0.7; 1.2]
p = [0.7; 1.2]
y = net(p)
% compare response with output coding (a,b,c,d)

p =
    0.7000
    1.2000
y =
     1
     1

Published with MATLAB® 7.14

Page 13 of 91
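Why do the commented-out codings above fail? Each perceptron output neuron draws a single line, so every output bit must split the four classes linearly. A sketch (assuming P, T and K from above; not part of the original notes): in the coding a=[0 0], b=[1 1], d=[0 1], c=[1 0], bit 2 assigns 1 to the diagonal pair {B,D} and 0 to {A,C}, an XOR-like split that no single line can realize, so training on that bit cannot converge. (The second commented coding fails for a different reason: classes A and C share the code [0 1].)

bit2 = [zeros(1,K) ones(1,K) zeros(1,K) ones(1,K)]; % bit-2 targets for A,B,C,D
netbit = perceptron;
netbit.trainParam.epochs = 50;         % cap training; it cannot converge anyway
netbit = train(netbit,P,bit2);
accuracy = 100*mean(netbit(P)==bit2)   % stays below 100%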
ADALINE time series prediction

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: Construct an ADALINE for adaptive prediction of time series based on past time series data.

Contents

q Define input and output data
q Prepare data for neural network toolbox
q Define ADALINE neural network
q Adaptive learning of the ADALINE
q Plot results

Define input and output data

close all, clear all, clc, format compact
% define segments of time vector
dt = 0.01; % time step [seconds]
t1 = 0 : dt : 3; % first time vector [seconds]
t2 = 3+dt : dt : 6; % second time vector [seconds]
t = [t1 t2]; % complete time vector [seconds]
% define signal
y = [sin(4.1*pi*t1) .8*sin(8.3*pi*t2)];
% plot signal
plot(t,y,'.-')
xlabel('Time [sec]');
ylabel('Target Signal');
grid on
ylim([-1.2 1.2])

Page 14 of 91
Prepare data for neural network toolbox

% There are two basic types of input vectors: those that occur concurrently
% (at the same time, or in no particular time sequence), and those that
% occur sequentially in time. For concurrent vectors, the order is not
% important, and if there were a number of networks running in parallel,
% you could present one input vector to each of the networks. For
% sequential vectors, the order in which the vectors appear is important.
p = con2seq(y);

Define ADALINE neural network

% The resulting network will predict the next value of the target signal
% using delayed values of the target.
inputDelays = 1:5; % delayed inputs to be used
learning_rate = 0.2; % learning rate
% define ADALINE
net = linearlayer(inputDelays,learning_rate);

Adaptive learning of the ADALINE

% Given an input sequence with N steps the network is updated as follows.
% Each step in the sequence of inputs is presented to the network one at
% a time. The network's weight and bias values are updated after each step,

Page 15 of 91
% before the next step in the sequence is presented. Thus the network is
% updated N times. The output signal and the error signal are returned,
% along with the new network.
[net,Y,E] = adapt(net,p,p);
% view network structure
view(net)
% check final network parameters
disp('Weights and bias of the ADALINE after adaptation')
net.IW{1}
net.b{1}

Weights and bias of the ADALINE after adaptation
ans =
    0.7179    0.4229    0.1552   -0.1203   -0.4159
ans =
  -1.2520e-08

Plot results

% transform result vectors
Y = seq2con(Y); Y = Y{1};
E = seq2con(E); E = E{1};
% start a new figure
figure;
% first graph
subplot(211)
plot(t,y,'b', t,Y,'r--');
legend('Original','Prediction')
grid on
xlabel('Time [sec]');
ylabel('Target Signal');
ylim([-1.2 1.2])
% second graph
subplot(212)
plot(t,E,'g');
grid on

Page 16 of 91
legend('Prediction error')
xlabel('Time [sec]');
ylabel('Error');
ylim([-1.2 1.2])

Published with MATLAB® 7.14

Page 17 of 91
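A small check (a sketch using t and E from above; not part of the original notes): quantify how well the ADALINE has re-adapted after the signal change at t = 3 s by the mean squared prediction error over the final second.

idx = t > 5;                % final second of the signal
mse_tail = mean(E(idx).^2)  % a small value indicates successful re-adaptation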
Solving XOR problem with a multilayer perceptron

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: 4 clusters of data (A,B,C,D) are defined in a 2-dimensional input space. (A,C) and (B,D) clusters represent the XOR classification problem. The task is to define a neural network for solving the XOR problem.

Contents

q Define 4 clusters of input data
q Define output coding for XOR problem
q Prepare inputs & outputs for network training
q Create and train a multilayer perceptron
q Plot targets and network response to see how well the network learns the data
q Plot classification result for the complete input space

Define 4 clusters of input data

close all, clear all, clc, format compact
% number of samples of each class
K = 100;
% define 4 clusters of input data
q = .6; % offset of classes
A = [rand(1,K)-q; rand(1,K)+q];
B = [rand(1,K)+q; rand(1,K)+q];
C = [rand(1,K)+q; rand(1,K)-q];
D = [rand(1,K)-q; rand(1,K)-q];
% plot clusters
figure(1)
plot(A(1,:),A(2,:),'k+')
hold on
grid on
plot(B(1,:),B(2,:),'bd')
plot(C(1,:),C(2,:),'k+')
plot(D(1,:),D(2,:),'bd')
% text labels for clusters
text(.5-q,.5+2*q,'Class A')
text(.5+q,.5+2*q,'Class B')
text(.5+q,.5-2*q,'Class A')
text(.5-q,.5-2*q,'Class B')

Page 18 of 91
Define output coding for XOR problem

% encode clusters A and C as one class, and B and D as another class
a = -1; % a | b
c = -1; % -------
b = 1;  % d | c
d = 1;  %

Prepare inputs & outputs for network training

% define inputs (combine samples from all four classes)
P = [A B C D];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B)) ...
     repmat(c,1,length(C)) repmat(d,1,length(D)) ];
% view inputs | outputs
%[P' T']

Create and train a multilayer perceptron

% create a neural network
net = feedforwardnet([5 3]);
% train net
net.divideParam.trainRatio = 1; % training set [%]
net.divideParam.valRatio = 0; % validation set [%]
net.divideParam.testRatio = 0; % test set [%]
% train a neural network
[net,tr,Y,E] = train(net,P,T);
% show network
view(net)

Page 19 of 91
Plot targets and network response to see how well the network learns the data

figure(2)
plot(T','linewidth',2)
hold on
plot(Y','r--')
grid on
legend('Targets','Network response','location','best')
ylim([-1.25 1.25])

Plot classification result for the complete input space

% generate a grid
span = -1:.005:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = net(pp);
% translate output into [-1,1]
%aa = -1 + 2*(aa>0);
% plot classification regions
figure(1)
mesh(P1,P2,reshape(aa,length(span),length(span))-5);
colormap cool

Page 20 of 91
view(2)

Published with MATLAB® 7.14

Page 21 of 91
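A quick accuracy check (a sketch using P, T and the trained net from above; not part of the original notes): digitize the network output with SIGN; since the targets are coded -1/+1, the sign of the response is the predicted class.

Yc = sign(net(P));           % digitized network response
accuracy = 100*mean(Yc==T)   % percentage of correctly classified samples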
Classification of a 4-class problem with a multilayer perceptron

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: 4 clusters of data (A,B,C,D) are defined in a 2-dimensional input space. The task is to define a neural network for classification of an arbitrary point in the 2-dimensional space into one of the classes (A,B,C,D).

Contents

q Define 4 clusters of input data
q Define output coding for all 4 clusters
q Prepare inputs & outputs for network training
q Create and train a multilayer perceptron
q Evaluate network performance and plot results
q Plot classification result for the complete input space

Define 4 clusters of input data

close all, clear all, clc, format compact
% number of samples of each class
K = 100;
% define 4 clusters of input data
q = .6; % offset of classes
A = [rand(1,K)-q; rand(1,K)+q];
B = [rand(1,K)+q; rand(1,K)+q];
C = [rand(1,K)+q; rand(1,K)-q];
D = [rand(1,K)-q; rand(1,K)-q];
% plot clusters
figure(1)
plot(A(1,:),A(2,:),'k+')
hold on
grid on
plot(B(1,:),B(2,:),'b*')
plot(C(1,:),C(2,:),'kx')
plot(D(1,:),D(2,:),'bd')
% text labels for clusters
text(.5-q,.5+2*q,'Class A')
text(.5+q,.5+2*q,'Class B')
text(.5+q,.5-2*q,'Class C')
text(.5-q,.5-2*q,'Class D')

Page 22 of 91
Define output coding for all 4 clusters

% coding (+1/-1) of 4 separate classes
a = [-1 -1 -1 +1]';
b = [-1 -1 +1 -1]';
d = [-1 +1 -1 -1]';
c = [+1 -1 -1 -1]';

Prepare inputs & outputs for network training

% define inputs (combine samples from all four classes)
P = [A B C D];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B)) ...
     repmat(c,1,length(C)) repmat(d,1,length(D)) ];

Create and train a multilayer perceptron

% create a neural network
net = feedforwardnet([4 3]);
% train net
net.divideParam.trainRatio = 1; % training set [%]
net.divideParam.valRatio = 0; % validation set [%]
net.divideParam.testRatio = 0; % test set [%]
% train a neural network
[net,tr,Y,E] = train(net,P,T);
% show network
view(net)

Page 23 of 91
Evaluate network performance and plot results

% evaluate performance: decoding network response
[m,i] = max(T); % target class
[m,j] = max(Y); % predicted class
N = length(Y); % number of all samples
k = 0; % number of misclassified samples
if find(i-j), % if there exist misclassified samples
    k = length(find(i-j)); % get the number of misclassified samples
end
fprintf('Correctly classified samples: %.1f%% of samples\n', 100*(N-k)/N)
% plot network output
figure;
subplot(211)
plot(T')
title('Targets')
ylim([-2 2])
grid on
subplot(212)
plot(Y')
title('Network response')
xlabel('# sample')
ylim([-2 2])
grid on

Correctly classified samples: 100.0% of samples

Page 24 of 91
Plot classification result for the complete input space

% generate a grid
span = -1:.01:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = net(pp);
% plot classification regions based on MAX activation
figure(1)
m = mesh(P1,P2,reshape(aa(1,:),length(span),length(span))-5);
set(m,'facecolor',[1 0.2 .7],'linestyle','none');
hold on
m = mesh(P1,P2,reshape(aa(2,:),length(span),length(span))-5);
set(m,'facecolor',[1 1.0 0.5],'linestyle','none');
m = mesh(P1,P2,reshape(aa(3,:),length(span),length(span))-5);
set(m,'facecolor',[.4 1.0 0.9],'linestyle','none');
m = mesh(P1,P2,reshape(aa(4,:),length(span),length(span))-5);
set(m,'facecolor',[.3 .4 0.5],'linestyle','none');
view(2)

Page 25 of 91
Published with MATLAB® 7.14

Page 26 of 91
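A per-class view of the errors (a sketch using T and Y from above; not part of the original notes): build a 4x4 confusion matrix from the MAX-decoded classes; rows are target classes, columns are predicted classes, and the diagonal holds the correct classifications.

[m,i] = max(T);   % target class indices
[m,j] = max(Y);   % predicted class indices
CM = zeros(4,4);
for s = 1:length(i)
    CM(i(s),j(s)) = CM(i(s),j(s)) + 1;
end
CM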
Industrial diagnostic of compressor connection rod defects

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: Industrial production of compressors suffers from problems during the imprinting operation where a connection rod is connected with a compressor head. Irregular imprinting can cause damage or a crack of the connection rod, which results in a damaged compressor. Such compressors should be eliminated from the production line, but defects of this type are difficult to detect. The task is to detect crack and overload defects from the measurement of the imprinting force.

Contents

q Photos of the broken connection rod
q Load and plot data
q Prepare inputs: Data resampling
q Define binary output coding: 0=OK, 1=Error
q Create and train a multilayer perceptron
q Evaluate network performance
q Application

Photos of the broken connection rod

Page 27 of 91
Load and plot data

close all, clear all, clc, format compact
% industrial data
load data2.mat
whos
% show data for class 1: OK
figure
plot(force','c')
grid on, hold on
plot(force(find(target==1),:)','b')
xlabel('Time')
ylabel('Force')
title(notes{1})
% show data for class 2: Overload
figure
plot(force','c')
grid on, hold on
plot(force(find(target==2),:)','r')
xlabel('Time')
ylabel('Force')
title(notes{2})
% show data for class 3: Crack
figure
plot(force','c')

Page 28 of 91
grid on, hold on
plot(force(find(target==3),:)','m')
xlabel('Time')
ylabel('Force')
title(notes{3})

  Name        Size          Bytes      Class     Attributes
  force       2000x100      1600000    double
  notes       1x3           222        cell
  target      2000x1        16000      double

Page 29 of 91
Prepare inputs: Data resampling

Page 30 of 91
% include only every step-th data point
step = 10;
force = force(:,1:step:size(force,2));
whos
% show resampled data for class 1: OK
figure
plot(force','c')
grid on, hold on
plot(force(find(target==1),:)','b')
xlabel('Time')
ylabel('Force')
title([notes{1} ' (resampled data)'])
% show resampled data for class 2: Overload
figure
plot(force','c')
grid on, hold on
plot(force(find(target==2),:)','r')
xlabel('Time')
ylabel('Force')
title([notes{2} ' (resampled data)'])
% show resampled data for class 3: Crack
figure
plot(force','c')
grid on, hold on
plot(force(find(target==3),:)','m')
xlabel('Time')
ylabel('Force')
title([notes{3} ' (resampled data)'])

  Name        Size         Bytes     Class     Attributes
  force       2000x10      160000    double
  notes       1x3          222       cell
  step        1x1          8         double
  target      2000x1       16000     double

Page 31 of 91
Define binary output coding: 0=OK, 1=Error

% binary coding 0/1
target = double(target > 1);

Create and train a multilayer perceptron

% create a neural network
net = feedforwardnet([4]);
% set early stopping parameters
net.divideParam.trainRatio = 0.70; % training set [%]
net.divideParam.valRatio = 0.15; % validation set [%]
net.divideParam.testRatio = 0.15; % test set [%]
% train a neural network
[net,tr,Y,E] = train(net,force',target');

Evaluate network performance

% digitize network response
threshold = 0.5;
Y = double(Y > threshold)';

Page 33 of 91
% find percentage of correct classifications
correct_classifications = 100*length(find(Y==target))/length(target)

correct_classifications =
   99.7500

Application

% get sample
random_index = randi(length(force))
sample = force(random_index,:);
% plot sample
figure
plot(force','c')
grid on, hold on
plot(sample,'b')
xlabel('Time')
ylabel('Force')
% predict quality
q = net(sample');
% digitize network response
q = double(q > threshold)'
% comment on network response
if q==target(random_index)
    title(sprintf('Quality: %d (correct network response)',q))
else
    title(sprintf('Quality: %d (wrong network response)',q))
end

random_index =
        1881
q =
     0

Page 34 of 91
Published with MATLAB® 7.14

Page 35 of 91
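For a diagnostic task the two error types matter more than overall accuracy. A sketch (using the digitized Y and target from above; not part of the original notes) separates false alarms (a good part flagged as defective) from missed defects (a defective part passed as good):

false_alarms   = sum(Y==1 & target==0)
missed_defects = sum(Y==0 & target==1)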
Prediction of chaotic time series with NAR neural network

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: Design a neural network for the recursive prediction of the chaotic Mackey-Glass time series, try various network architectures and experiment with various delays.

Contents

q Generate data (Mackey-Glass time series)
q Define nonlinear autoregressive neural network
q Prepare input and target time series data for network training
q Train net
q Transform network into a closed-loop NAR network
q Recursive prediction on validation data

Generate data (Mackey-Glass time series)

close all, clear all, clc, format compact
% data settings
N = 700; % number of samples
Nu = 300; % number of learning samples
% Mackey-Glass time series
b = 0.1;
c = 0.2;
tau = 17;
% initialization
y = [0.9697 0.9699 0.9794 1.0003 1.0319 1.0703 1.1076 1.1352 1.1485 ...
     1.1482 1.1383 1.1234 1.1072 1.0928 1.0820 1.0756 1.0739 1.0759]';
% generate Mackey-Glass time series
for n=18:N+99
    y(n+1) = y(n) - b*y(n) + c*y(n-tau)/(1+y(n-tau).^10);
end
% remove initial values
y(1:100) = [];
% plot training and validation data
plot(y,'m-')
grid on, hold on
plot(y(1:Nu),'b')
plot(y,'+k','markersize',2)
legend('validation data','training data','sampling markers','location','southwest')
xlabel('time (steps)')
ylabel('y')
ylim([-.5 1.5])
set(gcf,'position',[1 60 800 400])
% prepare training data
yt = con2seq(y(1:Nu)');
% prepare validation data
yv = con2seq(y(Nu+1:end)');

Page 36 of 91
Define nonlinear autoregressive neural network

%---------- network parameters -------------
% good parameters (you don't know 'tau' for an unknown process)
inputDelays = 1:6:19; % input delay vector
hiddenSizes = [6 3]; % network structure (number of neurons)
%-------------------------------------
% nonlinear autoregressive neural network
net = narnet(inputDelays, hiddenSizes);

Prepare input and target time series data for network training

% [Xs,Xi,Ai,Ts,EWs,shift] = preparets(net,Xnf,Tnf,Tf,EW)
%
% This function simplifies the normally complex and error-prone task of
% reformatting input and target time series. It automatically shifts input
% and target time series as many steps as are needed to fill the initial
% input and layer delay states. If the network has open-loop feedback,
% then it copies feedback targets into the inputs as needed to define the
% open-loop inputs.
%
% net : Neural network
% Xnf : Non-feedback inputs
% Tnf : Non-feedback targets
% Tf  : Feedback targets
% EW  : Error weights (default = {1})
%
% Xs  : Shifted inputs
% Xi  : Initial input delay states
% Ai  : Initial layer delay states
% Ts  : Shifted targets
[Xs,Xi,Ai,Ts] = preparets(net,{},{},yt);

Train net

% train net with prepared training data
net = train(net,Xs,Ts,Xi,Ai);
% view trained net
view(net)

Page 37 of 91
Transform network into a closed-loop NAR network

% close feedback for recursive prediction
net = closeloop(net);
% view closed-loop version of the net
view(net);

Recursive prediction on validation data

% prepare validation data for network simulation
yini = yt(end-max(inputDelays)+1:end); % initial values from training data
% combine initial values and validation data 'yv'
[Xs,Xi,Ai] = preparets(net,{},{},[yini yv]);
% predict on validation data
predict = net(Xs,Xi,Ai);
% validation data
Yv = cell2mat(yv);
% prediction
Yp = cell2mat(predict);
% error
e = Yv - Yp;
% plot results of recursive simulation
figure(1)
plot(Nu+1:N,Yp,'r')
plot(Nu+1:N,e,'g')
legend('validation data','training data','sampling markers',...
       'prediction','error','location','southwest')

Page 38 of 91
Published with MATLAB® 7.14

Page 39 of 91
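A one-number summary of the recursive prediction (a sketch using Yv and Yp from above; not part of the original notes): the RMSE over the whole validation horizon.

rmse = sqrt(mean((Yv - Yp).^2))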
Function approximation with RBFN

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: Create a function approximation model based on a measured data set. Apply various neural network architectures based on radial basis functions. Compare with multilayer perceptron and linear regression models.

Contents

q Linear Regression
q Exact RBFN
q RBFN
q GRNN
q RBFN trained by Bayesian regularization
q MLP
q Data generator

Linear Regression

close all, clear all, clc, format compact
% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% no hidden layers
net = feedforwardnet([]);
% % one hidden layer with linear transfer functions
% net = feedforwardnet([10]);
% net.layers{1}.transferFcn = 'purelin';
% set early stopping parameters
net.divideParam.trainRatio = 1.0; % training set [%]
net.divideParam.valRatio = 0.0; % validation set [%]
net.divideParam.testRatio = 0.0; % test set [%]
% train a neural network
net.trainParam.epochs = 200;
net = train(net,Xtrain,Ytrain);
%---------------------------------
% view net
view(net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'color',[1 .4 0])
legend('original function','available data','Linear regression','location','northwest')

Page 40 of 91
Exact RBFN

% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% choose a spread constant
spread = .4;
% create a neural network
net = newrbe(Xtrain,Ytrain,spread);
%---------------------------------
% view net
view(net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'r')
legend('original function','available data','Exact RBFN','location','northwest')

Warning: Rank deficient, rank = 53, tol = 1.110223e-13.

Page 41 of 91
RBFN

% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% choose a spread constant
spread = .2;
% choose max number of neurons
K = 40;
% performance goal (SSE)
goal = 0;
% number of neurons to add between displays
Ki = 5;
% create a neural network
net = newrb(Xtrain,Ytrain,goal,spread,K,Ki);
%---------------------------------
% view net
view(net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'r')
legend('original function','available data','RBFN','location','northwest')

NEWRB, neurons = 0, MSE = 333.938
NEWRB, neurons = 5, MSE = 47.271
NEWRB, neurons = 10, MSE = 12.3371
NEWRB, neurons = 15, MSE = 9.26908
NEWRB, neurons = 20, MSE = 4.16992
NEWRB, neurons = 25, MSE = 2.82444
NEWRB, neurons = 30, MSE = 2.43353
NEWRB, neurons = 35, MSE = 2.06149
NEWRB, neurons = 40, MSE = 1.94627

Page 42 of 91
GRNN

% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% choose a spread constant

Page 43 of 91
spread = .12;
% create a neural network
net = newgrnn(Xtrain,Ytrain,spread);
%---------------------------------
% view net
view(net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'r')
legend('original function','available data','GRNN','location','northwest')

RBFN trained by Bayesian regularization

% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%--------- RBFN ------------------
% choose a spread constant
spread = .2;
% choose max number of neurons
K = 20;
% performance goal (SSE)
goal = 0;
% number of neurons to add between displays
Ki = 20;
% create a neural network
net = newrb(Xtrain,Ytrain,goal,spread,K,Ki);
%---------------------------------

Page 44 of 91
% view net
view(net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'r')
% Show RBFN centers
c = net.iw{1};
plot(c,zeros(size(c)),'rs')
legend('original function','available data','RBFN','centers','location','northwest')
%--------- trainbr ---------------
% Retrain the RBFN using Bayesian regularization backpropagation
net.trainFcn='trainbr';
net.trainParam.epochs = 100;
% perform Levenberg-Marquardt training with Bayesian regularization
net = train(net,Xtrain,Ytrain);
%---------------------------------
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'m')
% Show RBFN centers
c = net.iw{1};
plot(c,ones(size(c)),'ms')
legend('original function','available data','RBFN','centers','RBFN + trainbr','new centers','location','northwest')

NEWRB, neurons = 0, MSE = 334.852
NEWRB, neurons = 20, MSE = 4.34189

Page 45 of 91
MLP

% generate data
[X,Xtrain,Ytrain,fig] = data_generator();
%---------------------------------
% create a neural network
net = feedforwardnet([12 6]);
% set early stopping parameters
net.divideParam.trainRatio = 1.0; % training set [%]
net.divideParam.valRatio = 0.0; % validation set [%]
net.divideParam.testRatio = 0.0; % test set [%]
% train a neural network
net.trainParam.epochs = 200;
net = train(net,Xtrain,Ytrain);
%---------------------------------
% view net
view(net)
% simulate a network over complete input range
Y = net(X);
% plot network response
figure(fig)
plot(X,Y,'color',[1 .4 0])
legend('original function','available data','MLP','location','northwest')

Page 46 of 91
Data generator

type data_generator

%% Data generator function
function [X,Xtrain,Ytrain,fig] = data_generator()
% data generator
X = 0.01:.01:10;
f = abs(besselj(2,X*7).*asind(X/2) + (X.^1.95)) + 2;
fig = figure;
plot(X,f,'b-')
hold on
grid on
% available data points
Ytrain = f + 5*(rand(1,length(f))-.5);
Xtrain = X([181:450 601:830]);
Ytrain = Ytrain([181:450 601:830]);
plot(Xtrain,Ytrain,'kx')
xlabel('x')
ylabel('y')
ylim([0 100])
legend('original function','available data','location','northwest')

Published with MATLAB® 7.14

Page 47 of 91
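To compare the models above on equal footing, a training-set RMSE can be computed for each (a sketch, not from the course notes; net here is whichever model was trained last, the MLP). Note that the exact RBFN interpolates the noisy data, so its training RMSE approaches zero, which says nothing about generalization; judging the fitted curves against the plotted original function is more informative.

Yt = net(Xtrain);                          % response on the training points
rmse_train = sqrt(mean((Ytrain - Yt).^2))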
Radial Basis Function Networks for Classification of XOR problem

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: 4 clusters of data (A,B,C,D) are defined in a 2-dimensional input space. (A,C) and (B,D) clusters represent the XOR classification problem. The task is to define a neural network for solving the XOR problem.

Contents

1. Classification of XOR problem with an exact RBFN
2. Classification of XOR problem with a RBFN
3. Classification of XOR problem with a PNN
4. Classification of XOR problem with a GRNN
5. Bayesian regularization for RBFN

Page 49 of 91
Classification of XOR problem with an exact RBFN

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: 2 groups of linearly inseparable data (A,B) are defined in a 2-dimensional input space. The task is to define a neural network for solving the XOR classification problem.

Contents

q Create input data
q Define output coding
q Prepare inputs & outputs for network training
q Create an exact RBFN
q Evaluate network performance
q Plot classification result
q Plot RBFN centers

Create input data

close all, clear all, clc, format compact
% number of samples of each cluster
K = 100;
% offset of clusters
q = .6;
% define 2 groups of input data
A = [rand(1,K)-q rand(1,K)+q; rand(1,K)+q rand(1,K)-q];
B = [rand(1,K)+q rand(1,K)-q; rand(1,K)+q rand(1,K)-q];
% plot data
plot(A(1,:),A(2,:),'k+',B(1,:),B(2,:),'b*')
grid on
hold on

Page 50 of 91
Define output coding

% coding (+1/-1) for 2-class XOR problem
a = -1;
b = 1;

Prepare inputs & outputs for network training

% define inputs (combine samples from both groups)
P = [A B];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B))];

Create an exact RBFN

% choose a spread constant
spread = 1;
% create a neural network
net = newrbe(P,T,spread);
% view network
view(net)

Page 51 of 91
Warning: Rank deficient, rank = 124, tol = 8.881784e-14.

Evaluate network performance

% simulate a network on training data
Y = net(P);
% calculate [%] of correct classifications
correct = 100 * length(find(T.*Y > 0)) / length(T);
fprintf('\nSpread = %.2f\n',spread)
fprintf('Num of neurons = %d\n',net.layers{1}.size)
fprintf('Correct class = %.2f %%\n',correct)
% plot targets and network response
figure;
plot(T')
hold on
grid on
plot(Y','r')
ylim([-2 2])
set(gca,'ytick',[-2 0 2])
legend('Targets','Network response')
xlabel('Sample No.')

Spread = 1.00
Num of neurons = 400
Correct class = 100.00 %

Page 52 of 91
Plot classification result

% generate a grid
span = -1:.025:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = sim(net,pp);
% plot classification regions based on MAX activation
figure(1)
ma = mesh(P1,P2,reshape(-aa,length(span),length(span))-5);
mb = mesh(P1,P2,reshape( aa,length(span),length(span))-5);
set(ma,'facecolor',[1 0.2 .7],'linestyle','none');
set(mb,'facecolor',[1 1.0 .5],'linestyle','none');
view(2)

Page 53 of 91
Published with MATLAB® 7.14

Page 55 of 91
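A note on the warning above (an interpretation, not from the course notes): NEWRBE places one neuron on every training sample (hence 400 neurons) and solves a 400x400 linear system for the output weights; the "Rank deficient" warning signals that this Gaussian interpolation matrix is nearly singular. A quick sweep of the condition number (using P from above and the radbas bias 0.8326/spread) makes the trend visible:

for s = [.1 .5 1 5]
    Z = exp(-(0.8326/s * dist(P',P)).^2);   % 400x400 radbas activations
    fprintf('spread %.1f: cond = %.2e\n', s, cond(Z));
end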
Classification of XOR problem with a RBFN

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: 2 groups of linearly inseparable data (A,B) are defined in a 2-dimensional input space. The task is to define a neural network for solving the XOR classification problem.

Contents

q Create input data
q Define output coding
q Prepare inputs & outputs for network training
q Create a RBFN
q Evaluate network performance
q Plot classification result
q Plot RBFN centers

Create input data

close all, clear all, clc, format compact
% number of samples of each cluster
K = 100;
% offset of clusters
q = .6;
% define 2 groups of input data
A = [rand(1,K)-q rand(1,K)+q; rand(1,K)+q rand(1,K)-q];
B = [rand(1,K)+q rand(1,K)-q; rand(1,K)+q rand(1,K)-q];
% plot data
plot(A(1,:),A(2,:),'k+',B(1,:),B(2,:),'b*')
grid on
hold on

Page 56 of 91
Define output coding

% coding (+1/-1) for 2-class XOR problem
a = -1;
b = 1;

Prepare inputs & outputs for network training

% define inputs (combine samples from both groups)
P = [A B];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B))];

Create a RBFN

% NEWRB algorithm
% The following steps are repeated until the network's mean squared error
% falls below goal:
% 1. The network is simulated
% 2. The input vector with the greatest error is found
% 3. A radbas neuron is added with weights equal to that vector
% 4. The purelin layer weights are redesigned to minimize error
% (a minimal sketch of this loop follows the training output below)
% choose a spread constant

Page 57 of 91
spread = 2;
% choose max number of neurons
K = 20;
% performance goal (SSE)
goal = 0;
% number of neurons to add between displays
Ki = 4;
% create a neural network
net = newrb(P,T,goal,spread,K,Ki);
% view network
view(net)

NEWRB, neurons = 0, MSE = 1
NEWRB, neurons = 4, MSE = 0.302296
NEWRB, neurons = 8, MSE = 0.221059
NEWRB, neurons = 12, MSE = 0.193983
NEWRB, neurons = 16, MSE = 0.154859
NEWRB, neurons = 20, MSE = 0.122332

Page 58 of 91
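The four NEWRB steps above can be made concrete with a minimal sketch of the greedy loop (an illustration of the idea under these assumptions, not MathWorks' implementation), reusing P, T, K and spread from this example:

b1 = 0.8326/spread;              % radbas bias corresponding to the spread
C = zeros(2,0);                  % selected centers
for k = 1:K
    if isempty(C)
        E = T;                   % no neurons yet: error equals target
    else
        Z = exp(-(b1 * dist(C',P)).^2);      % 1. simulate: Gaussian activations
        W = T / [Z; ones(1,size(P,2))];      % 4. least-squares weights + bias
        E = T - W*[Z; ones(1,size(P,2))];    %    current network error
    end
    [m,imax] = max(abs(E));      % 2. input vector with the greatest error
    C = [C P(:,imax)];           % 3. it becomes the next neuron's center
end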
Evaluate network performance

% simulate RBFN on training data
Y = net(P);
% calculate [%] of correct classifications
correct = 100 * length(find(T.*Y > 0)) / length(T);
fprintf('\nSpread = %.2f\n',spread)
fprintf('Num of neurons = %d\n',net.layers{1}.size)
fprintf('Correct class = %.2f %%\n',correct)
% plot targets and network response
figure;
plot(T')
hold on
grid on
plot(Y','r')
ylim([-2 2])
set(gca,'ytick',[-2 0 2])
legend('Targets','Network response')
xlabel('Sample No.')

Spread = 2.00
Num of neurons = 20
Correct class = 99.50 %

Page 59 of 91
Plot classification result

% generate a grid
span = -1:.025:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = sim(net,pp);
% plot classification regions based on MAX activation
figure(1)
ma = mesh(P1,P2,reshape(-aa,length(span),length(span))-5);
mb = mesh(P1,P2,reshape( aa,length(span),length(span))-5);
set(ma,'facecolor',[1 0.2 .7],'linestyle','none');
set(mb,'facecolor',[1 1.0 .5],'linestyle','none');
view(2)

Page 60 of 91
Published with MATLAB® 7.14

Page 62 of 91
Classification of XOR problem with a PNN

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: 2 groups of linearly inseparable data (A,B) are defined in a 2-dimensional input space. The task is to define a neural network for solving the XOR classification problem.

Contents

q Create input data
q Define output coding
q Prepare inputs & outputs for network training
q Create a PNN
q Evaluate network performance
q Plot classification result for the complete input space
q Plot PNN centers

Create input data

close all, clear all, clc, format compact
% number of samples of each cluster
K = 100;
% offset of clusters
q = .6;
% define 2 groups of input data
A = [rand(1,K)-q rand(1,K)+q; rand(1,K)+q rand(1,K)-q];
B = [rand(1,K)+q rand(1,K)-q; rand(1,K)+q rand(1,K)-q];
% plot data
plot(A(1,:),A(2,:),'k+',B(1,:),B(2,:),'b*')
grid on
hold on

Page 63 of 91
Define output coding

% class index coding (1/2) for the 2-class XOR problem
a = 1;
b = 2;

Prepare inputs & outputs for network training

% define inputs (combine samples from both groups)
P = [A B];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B))];

Create a PNN

% choose a spread constant
spread = .5;
% create a neural network
net = newpnn(P,ind2vec(T),spread);
% view network
view(net)

Page 64 of 91
Evaluate network performance

% simulate PNN on training data
Y = net(P);
Y = vec2ind(Y);
% calculate [%] of correct classifications
correct = 100 * length(find(T==Y)) / length(T);
fprintf('\nSpread = %.2f\n',spread)
fprintf('Num of neurons = %d\n',net.layers{1}.size)
fprintf('Correct class = %.2f %%\n',correct)
% plot targets and network response
figure;
plot(T')
hold on
grid on
plot(Y','r--')
ylim([0 3])
set(gca,'ytick',[-2 0 2])
legend('Targets','Network response')
xlabel('Sample No.')

Spread = 0.50
Num of neurons = 400
Correct class = 100.00 %

Page 65 of 91
Plot classification result for the complete input space

% generate a grid
span = -1:.025:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = sim(net,pp);
aa = vec2ind(aa)-1.5; % convert class indices 1/2 to -0.5/+0.5
% plot classification regions based on MAX activation
figure(1)
ma = mesh(P1,P2,reshape(-aa,length(span),length(span))-5);
mb = mesh(P1,P2,reshape( aa,length(span),length(span))-5);
set(ma,'facecolor',[1 0.2 .7],'linestyle','none');
set(mb,'facecolor',[1 1.0 .5],'linestyle','none');
view(2)

Page 66 of 91
Published with MATLAB® 7.14

Page 68 of 91
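A short usage sketch (assuming the trained net from above; not part of the original notes): classify a new, hypothetical point; the PNN output is a one-of-N vector, so VEC2IND recovers the class index (1 = A, 2 = B).

p_new = [0.5; 1.2];           % hypothetical test point
class = vec2ind(net(p_new))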
Classification of XOR problem with a GRNN

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: 2 groups of linearly inseparable data (A,B) are defined in a 2-dimensional input space. The task is to define a neural network for solving the XOR classification problem.

Contents

q Create input data
q Define output coding
q Prepare inputs & outputs for network training
q Create a GRNN
q Evaluate network performance
q Plot classification result
q Plot GRNN centers

Create input data

close all, clear all, clc, format compact
% number of samples of each cluster
K = 100;
% offset of clusters
q = .6;
% define 2 groups of input data
A = [rand(1,K)-q rand(1,K)+q; rand(1,K)+q rand(1,K)-q];
B = [rand(1,K)+q rand(1,K)-q; rand(1,K)+q rand(1,K)-q];
% plot data
plot(A(1,:),A(2,:),'k+',B(1,:),B(2,:),'b*')
grid on
hold on

Page 69 of 91
Define output coding

% coding (+1/-1) for the 2-class XOR problem
a = -1; b = 1;

Prepare inputs & outputs for network training

% define inputs (combine samples from both classes)
P = [A B];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B))];

Create a GRNN

% choose a spread constant
spread = .2;
% create a neural network
net = newgrnn(P,T,spread);
% view network
view(net)

Page 70 of 91
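Unlike the PNN, a GRNN produces a kernel-weighted average of the training targets rather than a class competition, so its output is continuous. The sketch below is illustrative only; x is a hypothetical query point, and 0.8326 is the radbas scaling constant noted earlier:

% minimal sketch of the GRNN prediction for one query point x (2x1)
x = [0; 0];                                  % hypothetical query point
d = sqrt(sum(bsxfun(@minus, P, x).^2, 1));   % distances to all training samples
h = exp(-(d * 0.8326/spread).^2);            % Gaussian kernel activations
y = (T * h') / sum(h)                        % weighted average of targets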
Evaluate network performance

% simulate GRNN on training data
Y = net(P);
% calculate [%] of correct classifications
correct = 100 * length(find(T.*Y > 0)) / length(T);
fprintf('\nSpread = %.2f\n',spread)
fprintf('Num of neurons = %d\n',net.layers{1}.size)
fprintf('Correct class = %.2f %%\n',correct)
% plot targets and network response
figure;
plot(T')
hold on
grid on
plot(Y','r')
ylim([-2 2])
set(gca,'ytick',[-2 0 2])
legend('Targets','Network response')
xlabel('Sample No.')

Spread = 0.20
Num of neurons = 400
Correct class = 100.00 %

Page 71 of 91
Plot classification result

% generate a grid
span = -1:.025:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = sim(net,pp);
% plot classification regions based on MAX activation
figure(1)
ma = mesh(P1,P2,reshape(-aa,length(span),length(span))-5);
mb = mesh(P1,P2,reshape( aa,length(span),length(span))-5);
set(ma,'facecolor',[1 0.2 .7],'linestyle','none');
set(mb,'facecolor',[1 1.0 .5],'linestyle','none');
view(2)

Page 72 of 91
Published with MATLAB® 7.14

Page 74 of 91
Bayesian regularization for RBFN

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: 2 groups of linearly inseparable data (A,B) are defined in a 2-dimensional input space. The task is to define a neural network for solving the XOR classification problem.

Contents

q Create input data
q Define output coding
q Prepare inputs & outputs for network training
q Create a RBFN
q Evaluate network performance
q Plot classification result
q Retrain a RBFN using Bayesian regularization backpropagation
q Evaluate network performance after Bayesian regularization training
q Plot classification result after Bayesian regularization training

Create input data

close all, clear all, clc, format compact
% number of samples of each cluster
K = 100;
% offset of clusters
q = .6;
% define 2 groups of input data
A = [rand(1,K)-q rand(1,K)+q; rand(1,K)+q rand(1,K)-q];
B = [rand(1,K)+q rand(1,K)-q; rand(1,K)+q rand(1,K)-q];
% plot data
plot(A(1,:),A(2,:),'k+',B(1,:),B(2,:),'b*')
grid on
hold on

Page 75 of 91
Define output coding

% coding (+1/-1) for the 2-class XOR problem
a = -1; b = 1;

Prepare inputs & outputs for network training

% define inputs (combine samples from both classes)
P = [A B];
% define targets
T = [repmat(a,1,length(A)) repmat(b,1,length(B))];

Create a RBFN

% choose a spread constant
spread = .1;
% choose max number of neurons
K = 10;
% performance goal (SSE)
goal = 0;
% number of neurons to add between displays
Ki = 2;
% create a neural network
net = newrb(P,T,goal,spread,K,Ki);
% view network

Page 76 of 91
view(net)

NEWRB, neurons = 0, MSE = 1
NEWRB, neurons = 2, MSE = 0.928277
NEWRB, neurons = 4, MSE = 0.855829
NEWRB, neurons = 6, MSE = 0.798564
NEWRB, neurons = 8, MSE = 0.742854
NEWRB, neurons = 10, MSE = 0.690962

Evaluate network performance

% check RBFN spread
actual_spread = net.b{1}
% simulate RBFN on training data
Y = net(P);

Page 77 of 91
% calculate [%] of correct classifications
correct = 100 * length(find(T.*Y > 0)) / length(T);
fprintf('\nSpread = %.2f\n',spread)
fprintf('Num of neurons = %d\n',net.layers{1}.size)
fprintf('Correct class = %.2f %%\n',correct)
% plot targets and network response to see how well the network learns the data
figure;
plot(T')
ylim([-2 2])
set(gca,'ytick',[-2 0 2])
hold on
grid on
plot(Y','r')
legend('Targets','Network response')
xlabel('Sample No.')

actual_spread =
    8.3255
    8.3255
    8.3255
    8.3255
    8.3255
    8.3255
    8.3255
    8.3255
    8.3255
    8.3255

Spread = 0.10
Num of neurons = 10
Correct class = 79.50 %

Page 78 of 91
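Note that the bias value 8.3255 printed above is 0.8326/spread, the scaling newrb assigns so that an input at distance spread from a centre yields an activation of 0.5. The forward pass of the trained network can be reproduced directly from its weights; the following is a minimal illustrative sketch using only standard toolbox functions (dist, radbas):

% minimal sketch reproducing net(P) for a newrb network
a1 = radbas(bsxfun(@times, dist(net.IW{1,1}, P), net.b{1}));  % hidden layer
Y_check = net.LW{2,1} * a1 + net.b{2};                        % linear output (scalar bias)
max(abs(Y_check - net(P)))                                    % should be close to 0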
Plot classification result

% generate a grid
span = -1:.025:2;
[P1,P2] = meshgrid(span,span);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = sim(net,pp);
% plot classification regions based on MAX activation
figure(1)
ma = mesh(P1,P2,reshape(-aa,length(span),length(span))-5);
mb = mesh(P1,P2,reshape( aa,length(span),length(span))-5);
set(ma,'facecolor',[1 0.2 .7],'linestyle','none');
set(mb,'facecolor',[1 1.0 .5],'linestyle','none');
view(2)
% plot RBFN centers
plot(net.iw{1}(:,1),net.iw{1}(:,2),'gs')

Page 79 of 91
Retrain a RBFN using Bayesian regularization backpropagation

% define custom training function: Bayesian regularization backpropagation
net.trainFcn = 'trainbr';
% perform Levenberg-Marquardt training with Bayesian regularization
net = train(net,P,T);

Evaluate network performance after Bayesian regularization training

% check new RBFN spread
spread_after_training = net.b{1}
% simulate RBFN on training data
Y = net(P);
% calculate [%] of correct classifications
correct = 100 * length(find(T.*Y > 0)) / length(T);
fprintf('Num of neurons = %d\n',net.layers{1}.size)
fprintf('Correct class = %.2f %%\n',correct)
% plot targets and network response
figure;
plot(T')
ylim([-2 2])
set(gca,'ytick',[-2 0 2])
hold on
grid on
plot(Y','r')
legend('Targets','Network response')

Page 80 of 91
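trainbr minimizes a weighted sum of squared errors and squared weights (F = beta*SSE + alpha*SSW), adapting the two coefficients during training; this is why it can both relocate the centres and rescale the spreads, as the output below shows. The two terms can be inspected after training with a quick illustrative check using the standard getwb helper:

% inspect the two terms of the regularized objective (illustrative)
wb  = getwb(net);              % all weights and biases as one vector
SSW = sum(wb.^2);              % sum of squared weights (penalty term)
SSE = sum((T - net(P)).^2);    % sum of squared errors (data term)
fprintf('SSE = %.3f, SSW = %.3f\n', SSE, SSW)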
xlabel('Sample No.')

spread_after_training =
    2.9924
    3.0201
    0.7809
    0.5933
    2.6968
    2.8934
    2.2121
    2.9748
    2.7584
    3.5739

Num of neurons = 10
Correct class = 100.00 %

Plot classification result after Bayesian regularization training

% simulate neural network on a grid
aa = sim(net,pp);
% plot classification regions based on MAX activation
figure(1)
ma = mesh(P1,P2,reshape(-aa,length(span),length(span))-5);
mb = mesh(P1,P2,reshape( aa,length(span),length(span))-5);
set(ma,'facecolor',[1 0.2 .7],'linestyle','none');
set(mb,'facecolor',[1 1.0 .5],'linestyle','none');
view(2)

Page 81 of 91
% plot modified RBFN centers
plot(net.iw{1}(:,1),net.iw{1}(:,2),'rs','linewidth',2)

Published with MATLAB® 7.14

Page 82 of 91
1D and 2D Self Organized Map

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: Define 1-dimensional and 2-dimensional SOM networks to represent the 2-dimensional input space.

Contents

q Define 4 clusters of input data
q Create and train 1D-SOM
q Plot 1D-SOM results
q Create and train 2D-SOM
q Plot 2D-SOM results

Define 4 clusters of input data

close all, clear all, clc, format compact
% number of samples of each cluster
K = 200;
% offset of clusters
q = 1.1;
% define 4 clusters of input data
P = [rand(1,K)-q rand(1,K)+q rand(1,K)+q rand(1,K)-q;
     rand(1,K)+q rand(1,K)+q rand(1,K)-q rand(1,K)-q];
% plot clusters
plot(P(1,:),P(2,:),'g.')
hold on
grid on

Page 83 of 91
Create and train 1D-SOM

% SOM parameters
dimensions   = [100];
coverSteps   = 100;
initNeighbor = 10;
topologyFcn  = 'gridtop';
distanceFcn  = 'linkdist';
% define net
net1 = selforgmap(dimensions,coverSteps,initNeighbor,topologyFcn,distanceFcn);
% train
[net1,Y] = train(net1,P);

Plot 1D-SOM results

% plot input data and SOM weight positions
plotsompos(net1,P);
grid on

Page 84 of 91
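After training, the learned prototypes are simply the rows of the input weight matrix, so the chain drawn by plotsompos can also be plotted by hand. A minimal sketch, assuming only the net1 trained above:

% direct look at the learned 1D-SOM codebook
W = net1.IW{1,1};            % 100x2 matrix of neuron weight positions
plot(W(:,1), W(:,2), 'ro-')  % neurons form a chain through the input space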
Create and train 2D-SOM

% SOM parameters
dimensions   = [10 10];
coverSteps   = 100;
initNeighbor = 4;
topologyFcn  = 'hextop';
distanceFcn  = 'linkdist';
% define net
net2 = selforgmap(dimensions,coverSteps,initNeighbor,topologyFcn,distanceFcn);
% train
[net2,Y] = train(net2,P);

Plot 2D-SOM results

% plot input data and SOM weight positions
plotsompos(net2,P);
grid on
% plot SOM neighbor distances
plotsomnd(net2)
% plot for each SOM neuron the number of input vectors that it classifies
figure
plotsomhits(net2,P)

Page 85 of 91
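The trained map can also be used directly as a clustering function: simulating it returns a one-hot vector per sample marking the winning neuron, which is the same information plotsomhits visualizes. A minimal usage sketch with the net2 trained above:

% assign each sample to its best-matching neuron (illustrative)
hits = net2(P);                % 100x800 one-hot outputs, one column per sample
bmu  = vec2ind(hits);          % index (1..100) of the winning neuron per sample
counts = full(sum(hits, 2));   % number of samples captured by each neuron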
Published with MATLAB® 7.14

Page 87 of 91
PCA for industrial diagnostic of compressor connection rod defects

Neural Networks course (practical examples) © 2012 Primoz Potocnik

PROBLEM DESCRIPTION: Industrial production of compressors suffers from problems during the imprinting operation, where a connection rod is connected with a compressor head. Irregular imprinting can cause damage or cracking of the connection rod, which results in a damaged compressor. Such compressors should be eliminated from the production line, but defects of this type are difficult to detect. The task is to detect crack and overload defects from the measurement of the imprinting force.

Contents

q Photos of the broken connection rod
q Load and plot data
q Prepare inputs by PCA
q Define output coding: 0=OK, 1=Error
q Create and train a multilayer perceptron
q Evaluate network performance
q Plot classification result

Photos of the broken connection rod

Load and plot data

close all, clear all, clc, format compact
% industrial data
load data2.mat
whos
% show data
figure
plot(force(find(target==1),:)','b') % OK (class 1)
grid on, hold on
plot(force(find(target>1),:)','r') % NOT OK (class 2 & 3)
xlabel('Time')
ylabel('Force')

  Name      Size        Bytes     Class    Attributes
  force     2000x100    1600000   double
  notes     1x3         222       cell
  target    2000x1      16000     double

Page 88 of 91
Prepare inputs by PCA

% 1. Standardize inputs to zero mean, variance one
[pn,ps1] = mapstd(force');
% 2. Apply Principal Components Analysis
% inputs whose contribution to total variation is less than maxfrac are removed
FP.maxfrac = 0.1;
% process inputs with principal component analysis
[ptrans,ps2] = processpca(pn, FP);
ps2
% transformed inputs
force2 = ptrans';
whos force force2
% plot data in the space of first 2 PCA components
figure
plot(force2(:,1),force2(:,2),'.') % OK
grid on, hold on
plot(force2(find(target>1),1),force2(find(target>1),2),'r.') % NOT OK
xlabel('pca1')
ylabel('pca2')
legend('OK','NOT OK','location','nw')
% % plot data in the space of first 3 PCA components
% figure
% plot3(force2(find(target==1),1),force2(find(target==1),2),force2(find(target==1),3),'b.')
% grid on, hold on
% plot3(force2(find(target>1),1),force2(find(target>1),2),force2(find(target>1),3),'r.')

ps2 =
         name: 'processpca'
        xrows: 100
      maxfrac: 0.1000
        yrows: 2
    transform: [2x100 double]
    no_change: 0

  Name      Size        Bytes     Class    Attributes
  force     2000x100    1600000   double

Page 89 of 91
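processpca stores the 2x100 projection in ps2.transform, and essentially the same result can be derived from first principles via the eigenvectors of the covariance matrix of the standardized data. The following sketch is illustrative only; eigenvector signs, and the exact variance measure processpca uses for the maxfrac rule, may differ in detail:

% approximate processpca from first principles (illustrative)
C = cov(pn');                        % 100x100 covariance of standardized inputs
[V,D] = eig(C);
[lambda, order] = sort(diag(D), 'descend');
V = V(:, order);
frac = lambda / sum(lambda);         % fraction of total variance per component
ncomp = sum(frac >= FP.maxfrac);     % components kept by the maxfrac rule
ptrans_check = V(:,1:ncomp)' * pn;   % compare with ptrans (up to sign)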
  force2    2000x2      32000     double

Define output coding: 0=OK, 1=Error

% binary coding 0/1
target = double(target > 1);

Create and train a multilayer perceptron

% create a neural network
net = feedforwardnet([6 4]);
% set early stopping parameters
net.divideParam.trainRatio = 0.70; % training set [%]
net.divideParam.valRatio = 0.15;   % validation set [%]
net.divideParam.testRatio = 0.15;  % test set [%]
% train a neural network
[net,tr,Y,E] = train(net,force2',target');
% show net
view(net)

Evaluate network performance

% digitize network response

Page 90 of 91
threshold = 0.5;
Y = double(Y > threshold)';
% find percentage of correct classifications
cc = 100*length(find(Y==target))/length(target);
fprintf('Correct classifications: %.1f [%%]\n', cc)

Correct classifications: 99.6 [%]

Plot classification result

figure(2)
a = axis;
% generate a grid, expand input space
xspan = a(1)-10 : .1 : a(2)+10;
yspan = a(3)-10 : .1 : a(4)+10;
[P1,P2] = meshgrid(xspan,yspan);
pp = [P1(:) P2(:)]';
% simulate neural network on a grid
aa = sim(net,pp);
aa = double(aa > threshold);
% plot classification regions based on MAX activation
ma = mesh(P1,P2,reshape(-aa,length(yspan),length(xspan))-4);
mb = mesh(P1,P2,reshape( aa,length(yspan),length(xspan))-5);
set(ma,'facecolor',[.7 1.0 1],'linestyle','none');
set(mb,'facecolor',[1 0.7 1],'linestyle','none');
view(2)

Published with MATLAB® 7.14

Page 91 of 91
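In production use, new force measurements must pass through exactly the same preprocessing chain, which is why the settings structures ps1 and ps2 were kept. A closing sketch, where new_force is a hypothetical M-by-100 matrix of new measurements and the 'apply' mode of mapstd and processpca reuses the stored settings:

% classify hypothetical new measurements with the stored preprocessing
new_pn     = mapstd('apply', new_force', ps1);      % standardize like training data
new_force2 = processpca('apply', new_pn, ps2);      % project onto the same 2 PCs
new_Y      = double(net(new_force2) > threshold)';  % 0=OK, 1=Error per sample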