Laboratory #6:
Neural Network Project
Kyle Villano, Tim Kindervatter and Kelvin Resto
ELC 343-01
Dr. Jesson
The College of New Jersey
December 10th, 2014
Table of Contents
Introduction
Problem Description
Materials and Procedure
Results
Discussion
Conclusion
Appendix
Introduction
In this lab we used the PSoC 5LP board, along with its PSoC Creator software, to build a
neural network program. The network needed to identify the type of signal input to the board
based on its Fourier transform, specifically distinguishing a sine wave from a square wave. The
network used a weighting system that allowed it to learn to identify these input signals. To
accomplish this, we wrote C code that simulated the learning of the neural network, and then
implemented it on the PSoC 5LP board. Once the code was written, we used an oscilloscope and a
waveform generator to produce inputs to the system and observe the corresponding outputs,
confirming that the network functioned in the desired manner. After completing the design of
the code, we observed that the neural network did correctly identify the two types of input
waves. However, our learning algorithm did have some trouble, which we believe can be
attributed to incorrect assignment of weight values to certain nodes.
Problem Description
The task in this lab was stated simply: create a neural network program able to learn to
identify two types of input signals. The network was to be based on assigning weight values to
various nodes, which would allow the program to learn from its mistakes and, after many
training passes, correctly identify square and sine input waves. The lab provided some guiding
information: the network needed 64 input nodes, obtained by performing a 128-point FFT on the
input signal, connected to 24 nodes in the hidden layer, which in turn produced the two output
nodes computed by the board. To teach the board to perform this task, we needed to create both
a training mode (to allow the board to learn which outputs correspond to which answers) and an
identification mode (to verify that the network worked correctly). Appendix A of the lab sheet
provided much relevant information on how to attack this problem, such as a diagram of the
training procedure and methods we could use to assign weight values to each node in our
program. The node diagram describing the training procedure can be seen below in Figure 1.
Using this information, along with background knowledge and examples of neural networks
found online, we were well equipped to solve this complex problem.
Figure 1: Illustration of Training Procedure
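The topology just described (64 input nodes, 24 hidden nodes, two output nodes) maps directly onto a few constants and arrays. The names below mirror the ones used later in our appendix code; this is only a sketch of the data layout, not the full program.

```c
#include <math.h>

/* Network dimensions from the lab handout:
   64 input nodes (half of a 128-point FFT), 24 hidden nodes, 2 output nodes. */
#define N_INPUT  64
#define N_HIDDEN 24
#define N_OUTPUT 2

double xi[N_INPUT];             /* input layer: FFT magnitudes            */
double xj[N_HIDDEN];            /* hidden layer outputs                   */
double xk[N_OUTPUT];            /* output layer: sine/square codes        */
double wji[N_HIDDEN][N_INPUT];  /* input-to-hidden weights                */
double wkj[N_OUTPUT][N_HIDDEN]; /* hidden-to-output weights               */
```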
Materials and Procedure
Materials Used:
• PSoC Creator Software
• CY8CKIT-001 with PSoC 5 LP Board
• PSoC Programmer
• Oscilloscope and corresponding probes
• Waveform generator
Procedure:
• Perform background research on neural networks and view working examples
• Create training mode for network based on back-error propagation techniques
• Run simulation protocol to allow network to learn and determine weights for nodes
• Create identification mode for network to verify accurate functionality
• Run network and observe output results to verify success of program
Results
We began this lab by researching neural networks and studying working examples. In
computer science, artificial neural networks are forms of computing architecture inspired by
biological neural networks, such as the brain, and are used to estimate or approximate
functions that depend on a large number of generally unknown inputs. An artificial neural
network is made up of systems of interconnected neurons that compute values from inputs, and
its adaptive nature makes it capable of machine learning and pattern recognition.
The next part of the lab involved creating a training mode for the neural network using
back-error propagation techniques. Given a set of inputs, the network is trained to give a
desired response. If the network gives the wrong answer, it is corrected by adjusting its
parameters so that the error is reduced. During this correction process, one starts with the
output nodes and propagates the error backward toward the input nodes, repeating until all the
weight values converge. For this lab the weight values were calculated using a C++ program,
since the PSoC board does not have the capacity to calculate the weights itself. Even though
the weights were calculated on a computer, the program still took extremely long to run
(approximately 4 hours).
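The correction step described above can be sketched in C. This is a minimal illustration of the standard delta rule for a single tanh output node, with an assumed learning rate `eta`; the exact formulas we used came from the lab handout and were run in our offline C++ program.

```c
#include <math.h>

/* One back-propagation update for a single tanh output node.
   Sketch only: the handout's exact formulas and learning rate may differ.
   target - desired output (+1 or -1)
   xk     - actual output of the node, xk = tanh(sk)
   xj[]   - the 24 hidden-layer outputs feeding the node
   wkj[]  - weights from the hidden nodes into this node (updated in place)
   eta    - learning rate (assumed)
   Returns the node's error delta. */
double update_output_node(double target, double xk,
                          const double xj[24], double wkj[24], double eta)
{
    /* delta = (target - actual) * tanh'(s), where tanh'(s) = 1 - tanh(s)^2 */
    double delta = (target - xk) * (1.0 - xk * xk);
    int j;
    for (j = 0; j < 24; j++)
        wkj[j] += eta * delta * xj[j]; /* nudge each weight toward the target */
    return delta;
}
```

Repeating such updates over every node and every training example is what made the offline run so long.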
The final part of the lab involved taking a 128-point FFT of a signal and feeding the
generated points into the neural network. The PSoC board, programmed with the neural network
(Figure 2), had to recognize two distinct input signals scaled between 0 and 1. The first
signal was a sinusoidal input, encoded as +1 for node 0 and -1 for node 1 in the output layer.
The other was a square wave input, encoded as -1 for node 0 and +1 for node 1. When a sine wave
of 1 volt peak-to-peak at 100 Hz was generated by the waveform generator, the PSoC board
recognized the input signal through its FFT inputs, which were fed through the neural network,
and displayed 'Sine' on the LCD screen (Figure 3). Likewise, when a square wave was fed to the
PSoC board, the LCD screen displayed 'Square' (Figure 4). Thus our neural network was a
success, and the board was able to recognize both input signals.
Figure 2: Neural network block diagram with ADC converter and LCD
Figure 3: Neural network recognizing sine wave
Figure 4: Neural network recognizing square wave
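The two-node output encoding used above ((+1, -1) for sine, (-1, +1) for square) can be decoded with a simple check. The function name here is illustrative, not taken from our program:

```c
/* Decode the two rounded output nodes into a label, using the encoding
   from the lab: (+1, -1) = sine, (-1, +1) = square. */
const char *classify(int node0, int node1)
{
    if (node0 == +1 && node1 == -1) return "Sine";
    if (node0 == -1 && node1 == +1) return "Square";
    return "Unknown"; /* any other combination is unrecognized */
}
```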
Discussion
In this project, we were instructed to implement a neural network able to recognize
a sine wave and a square wave, both of unity amplitude. This is accomplished by taking a
128-point FFT of the signal and feeding the generated points into the neural network. The
network uses a three-layer structure: a set of nodes that accepts the FFT points as input, a
middle hidden layer with a transfer function that transforms the output of the first layer, and
finally an output layer that produces the desired result.
The network is told from the outset to expect a certain output from the final layer of
nodes given a certain input into the input layer. In our case, the input is the FFT of a
waveform and the expected output is the code identifying that waveform. The learning algorithm
uses values called weights to calculate the error in the output nodes (in other words, how
close the network got to the expected output), and then uses that error value to adjust the
weights in an attempt to get closer to the desired output. Each node in the middle and output
layers has a weight associated with each node it receives input from. We assigned arbitrary
weights at first to give the algorithm a starting point. Then, using formulas given to us, we
implemented an algorithm in C++ that calculates the error in the output and uses that error
value to adjust the values of the weights. As a result, every time the learning algorithm runs,
it gets closer to the correct output value, until all of the error values converge to 0 and we
have exactly the output we desire.
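The convergence condition described above can be sketched as a sum of squared errors over the two output nodes. The tolerance below is illustrative, not our actual stopping rule:

```c
/* Sum of squared errors across the two output nodes. Training is finished
   once this falls below a small tolerance for every training example. */
double total_error(const double target[2], const double actual[2])
{
    double e = 0.0;
    int k;
    for (k = 0; k < 2; k++)
        e += (target[k] - actual[k]) * (target[k] - actual[k]);
    return e;
}

int converged(const double target[2], const double actual[2])
{
    return total_error(target, actual) < 1e-6; /* assumed tolerance */
}
```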
Our neural net was able to give a reasonable output for the square wave, but was unable
to reproduce a sine wave perfectly. In order to fix this, we would need to run the learning
algorithm at least one more time to recalculate the weights. Unfortunately, this process takes
many hours and we did not have the time to do this. If we had the time, the algorithm would be
able to adjust the weights to give an adequate output.
Conclusion
In this lab, we learned how to write high-level C code and how to simulate and verify the
results using the PSoC 5LP board. We also learned how to design a functioning neural network,
and how to effectively code and implement such a program for microcomputer projects. In
addition, we learned about various errors that can occur during the coding process, and how to
identify and correct them using the software, hardware, and various debugging methods.
We were able to incorporate knowledge and code from the many examples that we researched
online in completing our project. These newly learned skills (along with knowledge retained
from previous projects) allowed us to code and analyze the two routines that we needed to
implement in order to create this neural network. We also came to understand the purpose of the
many functions and variables used in our network programs, and, perhaps more importantly, we
identified the ways in which we needed to alter these components to fit the tasks of our
overall project.
In performing the project, we were initially overwhelmed with the scope of what we were
tasked with doing. However, through much background research and the studying of examples,
we learned the various concepts and ideas that a neural network contained, and how to
implement these concepts into our C code. We did come across a good amount of trouble when
developing the learning algorithm, as the weights assigned to various nodes did not seem to be
correctly allowing our network to learn to identify signals, among other things. We were also
surprised by the amount of time that the learning routine needed to run in order to be able to
correctly identify these input signals, as our program ran routines for several hours before it
could be used to identify a square and sine wave.
Overall, the neural network is a program with wide-reaching applications in the field of
microcomputer systems. For example, neural networks can be used in items ranging from home
appliances to entire power grids, as the learning algorithms will self-adjust to allocate power
more efficiently based upon observations of usage and intensity. Therefore, it is quite apparent
that the knowledge and experiences gained from successfully completing this project will also
prove to be useful in the future. This lab is also important in that it helps us understand how
neural networks are implemented in various programs, as well as the many applications that this
type of network can be used for in order to further our knowledge in this field and help us solve
future electrical and computer engineering projects.
Appendix: Code for Neural Network
#include <device.h>
#include <math.h>
// Creates the complex structure that will be used throughout the code. There
// is one real part and one imaginary part.
struct cmpx{double real; double imag;};
// Initializes COMPLEX to complex structure type
typedef struct cmpx COMPLEX;
// This function performs the FFT of an input sample array
void FFT(COMPLEX *Y, COMPLEX *w ) // input sample array
{
COMPLEX temp1, temp2; //Temporary storage variables
int i, j, k; //For loop counter variables
int upper_leg, lower_leg, leg_diff; //Index of upper/lower butterfly leg and difference between them
int index, step; //Index and step between twiddle factors
leg_diff = 64; // Starting difference between upper and lower butterfly legs
step = 1; // Step between values in the twiddle factor array for a 128-point FFT
// Perform the seven butterfly stages of the 128-point FFT
for(i=0; i<7; i++)
{
index = 0;
for(j=0; j<leg_diff; j++)
{
for(upper_leg=j; upper_leg<128; upper_leg+=2*leg_diff)
{
lower_leg = upper_leg + leg_diff;
temp1.real = Y[upper_leg].real + Y[lower_leg].real;
temp1.imag = Y[upper_leg].imag + Y[lower_leg].imag;
temp2.real = Y[upper_leg].real - Y[lower_leg].real;
temp2.imag = Y[upper_leg].imag - Y[lower_leg].imag;
Y[lower_leg].real = temp2.real * w[index].real -
temp2.imag * w[index].imag;
Y[lower_leg].imag = temp2.real * w[index].imag +
temp2.imag * w[index].real;
Y[upper_leg].real = temp1.real;
Y[upper_leg].imag = temp1.imag;
}
index += step;
}
leg_diff = leg_diff/2;
step *= 2;
}
// bit reversal for re-sequencing data
j = 0;
for(i=1; i<127; i++)
{
k = 64;
while(k<=j)
{
j = j-k;
k = k / 2;
}
j = j + k;
if(i<j)
{
temp1.real = Y[j].real;
temp1.imag = Y[j].imag;
Y[j].real = Y[i].real;
Y[j].imag = Y[i].imag;
Y[i].real = temp1.real;
Y[i].imag = temp1.imag;
}
}
return;
}
//end of FFT function
void main()
{
/*________ Initialize input-to-hidden weights wji, calculated offline in C++ _________*/
double wji[24][64] = {{0.1781, 1.32975, 3.6107, 2.68523, 2.12523, 1.59975,
1.44975, 0.487195, 1.19156, 0.312466, 0.812014, 0.497195, 0.750656, 1.04403,
2.84523, 2.9307, 2.31523, 0.637195, 1.09247, 2.3395, 7.8526, 0.874027,
1.10156, 0.931109, 1.05403, 0.173824, 0.935385, 1.06247, 1.50493, 0.999751,
1.11403, 1.07975, 0.83629, 1.8195, 1.74403, 1.54403, 1.52403, 6.18568,
9.77758, 4.10473, 2.2995, 2.11805, 1.20584, 1.03686, 1.35876, 0.129482,
0.749685, 0.888304, 1.65661, 1.65088, 0.688238, 2.21919, -0.648999, 1.23195,
0.321188, 8.02169, -0.0458608, 1.86566, 0.376475, 2.27742, 1.09371, 1.75371,
0.538238, 0.1405},
{0.595544, 1.5929, 3.96685, 2.86988, 2.33988, 1.6229, 1.0629, 0.552939,
0.658109, 1.13071, 0.554411, 0.882939, 1.3755, 1.55882, 2.35988, 4.39685,
2.36988, 0.792939, 0.470714, 2.6458, 8.62476, 1.62882, 0.808109, 1.34181,
1.67882, 0.679621, 0.56773, 1.28071, 1.16143, 1.6929, 1.51882, 1.4329,
0.420334, 2.2058, 1.80882, 1.07882, 1.50882, 7.78963, 12.5675, 5.13567,
2.8258, 2.58764, 0.974032, 0.616592, 1.08105, -0.52484, -0.0656388, 1.25475,
1.80949, 1.48541, -0.513793, 2.43016, -2.37456, 0.359391, -2.81896, -4.29961,
-5.38413, 0.912575, -0.387585, 1.85637, 1.51318, 1.11318, 0.336207, -
1.19228},
{0.41051, 4.36401, 11.2477, 7.43585, 8.04585, 4.44401, 4.38401, 1.12288,
2.76927, 1.7269, 2.40308, 0.912879, 2.86164, 4.18617, 8.10585, 11.5677,
7.48585, 1.54288, 1.3469, 7.28801, 26.3351, 3.54617, 2.13927, 2.68545,
3.76617, 1.04835, 2.98762, 1.6269, 2.8838, 4.29401, 3.65617, 4.43401,
2.02525, 7.62801, 3.33617, 3.66617, 3.80617, 22.7576, 37.7649, 14.5939,
7.45801, 7.17234, 2.35143, 3.00266, 3.21215, -0.978675, 1.76915, 3.80833,
6.33667, 5.85883, 1.28347, 9.32716, -0.984896, 6.33879, 6.81737, 66.7304,
10.8013, 11.8241, 2.67695, 10.4206, 5.48532, 5.65532, 1.63347, -0.497448},
{-0.0241987, 2.00168, 5.52173, 4.0367, 3.6867, 1.99168, 1.63168, 0.920807,
0.991666, 1.17666, 1.04916, 0.810807, 1.30667, 1.69833, 3.7967, 5.45173,
3.8167, 0.690807, 0.56666, 2.95335, 12.7018, 1.22833, 1.71167, 0.964168,
1.96833, 0.139152, 1.22082, 0.73666, 1.77332, 2.43168, 1.68833, 1.68168,
0.825812, 3.01335, 1.24833, 2.14833, 1.78833, 9.76011, 17.3402, 6.89006,
3.82335, 2.70665, 1.05831, 0.498274, 1.52247, -1.46095, 0.268222, 1.29498,
2.73995, 2.3466, -0.47848, 3.31157, -2.66199, 0.348068, -3.3624, -5.89371, -
6.82632, 1.22462, -0.91696, 2.18309, 1.05655, 1.07655, -0.27848, -1.07099},
{0.0293048, 1.08133, 2.41171, 1.97152, 1.91152, 0.901332, 1.05133, 0.503343,
0.883256, 0.169219, 0.806237, 0.283343, 0.567294, 0.562475, 1.85152, 2.82171,
2.02152, 0.903343, 0.749219, 1.55266, 4.77247, 0.682475, 1.15326, 0.970275,
0.872475, 0.498162, 1.19142, 0.709219, 0.988437, 1.03133, 0.582475, 1.47133,
0.10738, 1.76266, 1.08247, 1.11247, 0.432475, 3.82456, 6.58532, 2.91418,
1.29266, 1.66495, 1.0744, 0.365904, 1.1966, -0.26498, -0.0806674, 1.07362,
1.08724, 1.29838, -0.368382, 1.422, -1.76695, -0.436574, -2.76191, -10.0946,
-4.88543, -0.224767, -1.10676, 0.503615, 0.831807, 0.321807, -0.308382, -
0.173476},
{0.476395, 3.02602, 7.34247, 5.40425, 4.60425, 2.76602, 2.59602, 0.338041,
1.64272, 1.09108, 1.5019, 0.548041, 1.60437, 2.6038, 4.76425, 7.39247,
5.21425, 0.558041, 0.831078, 4.97203, 17.1454, 2.2338, 1.50272, 1.66355,
2.5138, 0.418609, 1.68133, 1.18108, 2.02216, 2.06602, 2.0638, 2.77602,
1.44969, 4.93203, 2.3638, 1.7838, 2.2338, 13.6027, 23.4157, 8.63628, 4.77203,
3.4376, 1.32051, 1.30716, 2.01076, -1.37468, -0.219482, 2.00159, 3.10318,
2.73096, -0.60391, 3.72255, -3.73605, 1.08041, -5.22955, -7.68152, -9.59519,
1.49473, -1.93782, 2.81864, 2.28432, 1.91432, -0.16391, -1.36803},
{0.751677, -0.966573, -5.1527, -2.44964, -2.59964, -1.16657, -0.586573,
0.399064, -0.421347, 0.0612656, -0.570041, 0.269064, -0.80396, -1.04008, -
2.42964, -4.4227, -2.75964, 0.409064, 0.301266, -2.59315, -12.095, -1.20008,
-0.611347, -0.252654, -1.19008, 0.555185, 0.213838, -0.358734, -0.827469, -
0.656573, -1.26008, -1.03657, 0.0764507, -3.07315, -0.850082, -1.05008, -
0.880082, -9.70891, -16.6912, -5.67278, -2.18315, -1.86016, -0.464856, -
0.210607, 0.107716, 1.64413, 0.298868, -0.45359, -1.62718, -1.01069,
0.841851, -2.10428, 4.20677, -0.00936273, 4.27926, 3.9924, 7.53666, -
0.880577, 1.5937, -1.27243, -0.691214, -0.941214, 1.11185, 2.48838},
{-0.230998, 2.45367, 5.67823, 4.17595, 3.81595, 2.40367, 2.02367, 0.121459,
1.33876, 0.9463, 1.09253, 0.441459, 1.76121, 1.83506, 4.62595, 6.50823,
3.89595, 0.0714588, 1.1863, 3.49734, 13.7974, 1.59506, 1.01876, 1.18498,
2.05506, 0.117615, 1.74637, 0.5363, 1.2926, 1.96367, 2.22506, 2.11367,
0.473915, 4.26734, 2.36506, 1.62506, 2.01506, 11.2179, 19.087, 7.06329,
4.10734, 3.49011, 1.08014, 0.739218, 1.06022, -1.35406, -0.216621, 1.48644,
2.71289, 2.69427, -0.413847, 3.84072, -2.81998, 0.754588, -3.67923, -6.64332,
-7.49462, 2.02302, -1.17769, 2.20687, 1.19844, 1.55844, -0.383847, -1.68499},
{0.198232, 1.97458, 5.1803, 3.15244, 3.93244, 2.20458, 1.57458, 0.677804,
1.25544, 1.06587, 1.41065, 0.697804, 1.17501, 2.14131, 4.07244, 5.3003,
3.78244, 0.587804, 0.835867, 3.55917, 11.4417, 1.55131, 1.09544, 1.88023,
1.42131, 0.911509, 0.796949, 0.755867, 1.37173, 2.01458, 1.74131, 2.28458,
0.507377, 3.79917, 2.13131, 2.13131, 1.90131, 10.2773, 16.3188, 6.09161,
3.45917, 2.70261, 1.26216, 1.25148, 0.963244, -0.578152, 0.201646, 1.77803,
2.21606, 2.41278, 0.0650921, 3.18081, -1.46768, 1.71804, -1.35454, 8.02823, -
2.00417, 2.531, -0.409816, 3.70591, 1.99295, 2.00295, 0.525092, -0.438838},
{0.21548, 1.57082, 3.60625, 2.12353, 2.61353, 1.83082, 1.11082, 0.094023,
0.573733, 1.08519, 1.06446, 0.064023, 1.12228, 1.49892, 2.84353, 3.59625,
2.27353, 0.124023, 0.89519, 1.79164, 8.26711, 1.40892, 1.35373, 0.923005,
1.46892, -0.0226243, 0.691109, 0.63519, 1.14038, 1.05082, 0.748923, 1.08082,
0.952566, 2.35164, 1.00892, 1.13892, 1.18892, 6.4606, 10.9215, 4.35517,
2.15164, 1.67785, 1.01184, 0.873236, 0.517756, -0.743999, -0.252451,
0.897028, 1.73406, 0.72216, -0.576243, 1.78919, -2.4152, -0.59977, -4.14121,
-17.3341, -8.82618, -0.543298, -1.13249, 0.212945, 0.556472, 0.0764724, -
0.756243, -1.2526},
{0.363154, 2.2314, 6.0747, 3.57805, 3.60805, 2.0114, 2.3514, 0.650484,
1.60673, 1.0094, 1.05307, 1.01048, 1.37406, 2.11614, 3.74805, 6.2047,
4.14805, 0.870484, 1.0594, 3.68279, 12.7213, 1.94614, 1.83673, 1.5304,
2.26614, 0.828409, 1.41514, 0.849405, 1.91881, 2.4014, 2.04614, 2.0314,
0.877814, 3.62279, 1.92614, 1.90614, 2.12614, 10.6341, 18.7207, 7.12084,
3.92279, 3.15228, 0.921479, 1.60037, 1.96722, -0.703124, 0.884606, 1.82088,
2.67177, 2.44651, 0.854095, 3.7774, -1.43846, 2.48484, 0.0804741, 13.8304, -
0.323147, 3.50559, 0.49819, 3.98149, 2.65075, 2.03075, 0.814095, -0.764231},
{1.04803, -1.52188, -5.64641, -3.83914, -3.57914, -1.27188, -1.26188,
0.172579, -0.130978, 0.264474, 0.141748, 0.762579, -1.16643, -1.0465, -
3.78914, -5.90641, -3.47914, 0.262579, 0.104474, -3.55377, -15.0855, -1.2565,
-0.330978, -0.363704, -1.1065, 0.492653, -0.518326, -0.175526, -0.451052, -
1.59188, -0.936504, -1.59188, 0.217127, -3.46377, -0.986504, -1.3265, -
1.5865, -12.1474, -20.8565, -7.09291, -3.71377, -2.80301, -0.0555994, -
0.250368, -0.148399, 1.99493, 1.12577, -0.701125, -2.26225, -1.73687,
1.74653, -3.068, 4.90031, 0.145791, 4.96263, 5.26085, 9.22874, -0.944945,
2.54305, -2.00147, -1.25074, -0.970735, 1.83653, 2.78016},
{0.400277, 1.35833, 3.89819, 2.53326, 2.63326, 1.29833, 1.36833, 0.787264,
1.26435, 0.407365, 0.560859, 0.667264, 0.801339, 0.721718, 2.19326, 3.34819,
2.43326, 0.0472637, 1.11737, 2.06665, 7.65793, 0.881718, 1.09435, 1.03785,
1.58172, 0.246885, 0.451237, 1.16737, 1.08473, 1.82833, 1.34172, 1.14833,
0.61425, 2.36665, 1.60172, 0.881718, 0.761718, 6.09978, 9.85952, 4.48991,
2.31665, 1.59344, 0.887744, 0.871893, 0.701616, -0.282764, -0.257932,
1.34511, 0.980218, 0.83361, -0.611149, 1.86872, -2.43723, 0.222637, -4.02574,
-12.9666, -7.32034, -0.223578, -1.5623, 1.19757, 0.363785, 0.983785, -
0.761149, -0.718616},
{0.279085, 2.14787, 4.59434, 2.9761, 2.8961, 2.04787, 1.56787, 0.148732,
0.798572, 0.428925, 1.28875, 0.568732, 1.85822, 1.9675, 2.7661, 4.74434,
2.4661, 0.638732, 0.978925, 2.57573, 9.32728, 1.7375, 1.05857, 1.3084,
1.9075, 0.479454, 0.938026, 0.428925, 1.02785, 1.50787, 1.6875, 1.82787,
0.708379, 2.83573, 1.1975, 1.7675, 1.4275, 7.94831, 13.3412, 5.31183,
2.94573, 2.57499, 0.508203, 1.16639, 1.2473, -0.542763, 0.73528, 1.37713,
2.58425, 1.77389, 0.624541, 2.69101, -0.659153, 1.21732, 0.182706, 9.57673, -
0.32913, 3.27009, 0.579082, 3.24555, 1.53278, 1.63278, 0.414541, -0.0745767},
{0.801652, 2.05456, 4.80349, 3.21903, 3.12903, 1.54456, 2.24456, 1.00454,
1.46878, 1.31589, 1.03733, 0.344544, 1.23167, 2.19467, 3.53903, 5.37349,
3.00903, 0.554544, 0.615887, 3.67913, 10.9513, 2.07467, 1.45878, 1.01023,
2.06467, 0.471549, 1.72033, 0.645887, 1.81177, 1.82456, 2.05467, 2.20456,
0.607436, 2.98913, 1.37467, 1.58467, 1.64467, 8.65708, 14.9249, 5.72815,
3.59913, 3.07933, 0.918882, 0.794976, 1.52332, 0.0621664, 0.325284, 1.77477,
2.46954, 2.13964, 0.68549, 3.86441, -0.493481, 1.89544, 0.987451, 14.8684,
0.949411, 3.92539, 0.0409802, 3.7499, 1.83995, 2.35995, 0.52549, -0.31174},
{-0.203802, 1.85697, 4.77407, 3.05052, 3.26052, 1.21697, 1.76697, 0.544908,
1.29955, 1.21084, 1.0502, 0.264908, 1.71826, 1.54039, 3.55052, 4.12407,
3.27052, 0.644908, 1.29084, 3.03394, 10.5283, 1.84039, 1.10955, 1.53891,
1.67039, 0.412776, 1.00233, 0.360841, 0.991683, 2.13697, 1.41039, 1.53697,
0.883618, 2.58394, 1.79039, 1.24039, 1.21039, 8.12156, 14.1058, 5.82446,
2.61394, 2.31079, 0.562973, 0.410657, 1.51446, -1.2167, -0.0790792, 1.32381,
2.26763, 1.46105, -0.462236, 2.79486, -1.78802, 0.729077, -2.93118, -4.95844,
-5.90013, 1.61039, -0.714473, 1.87263, 1.51131, 0.781313, 0.0877636, -
0.999011},
{-0.30755, 1.82211, 5.16712, 3.37461, 3.82461, 1.69211, 1.60211, -0.081049,
1.3691, 0.732602, 0.645853, 0.398951, 1.4656, 1.27171, 3.07461, 4.77712,
3.36461, 0.138951, 1.1826, 2.95421, 11.1271, 1.23171, 0.679103, 0.962354,
1.05171, 0.46285, 1.02195, 1.2226, 1.2852, 1.61211, 1.19171, 2.21211,
0.985452, 3.44421, 1.30171, 1.70171, 1.22171, 9.42384, 16.1739, 5.50882,
2.69421, 1.93341, 0.918703, 0.490504, 0.828054, -1.53955, -0.380697, 1.4413,
1.28261, 1.50221, -1.6615, 1.50351, -5.5155, -1.97049, -9.47749, -38.3379, -
17.7335, -1.69948, -3.833, 0.0220166, -0.0389917, 0.0210083, -1.3515, -
2.61775},
{0.144587, 1.05116, 4.31508, 2.75312, 2.92312, 1.91116, 1.64116, 0.94698,
1.54637, 0.583981, 0.970177, 0.51698, 0.858766, 1.28035, 2.49312, 3.58508,
2.31312, 0.69698, 0.743981, 2.22232, 8.91294, 1.69035, 0.886373, 1.57257,
1.64035, 0.385392, 1.13176, 0.363981, 1.43796, 1.22116, 1.08035, 1.06116,
0.509372, 2.20232, 1.48035, 1.80035, 1.88035, 6.74936, 11.9572, 5.21544,
2.43232, 2.03071, 0.775569, 0.94794, 1.23335, -0.129436, 0.775526, 1.44955,
1.7691, 2.44829, 0.713917, 2.33784, -0.534129, 1.1498, 0.699585, 9.66343,
0.0452529, 2.66568, 0.347834, 2.44176, 1.74588, 1.71588, 0.153917, 0.162935},
{0.284495, 2.54727, 6.65784, 4.55256, 4.76256, 2.15727, 2.35727, 0.235552,
1.49516, 1.3841, 1.32463, 0.0855521, 1.83621, 1.63925, 4.59256, 6.60784,
4.00256, 0.475552, 1.1841, 4.41454, 14.959, 2.05925, 1.20516, 2.12568,
1.53925, 0.582512, 1.03767, 1.1841, 1.3282, 2.16727, 1.93925, 2.73727,
0.546609, 3.67454, 2.05925, 2.44925, 1.89925, 11.8177, 20.3088, 8.4471,
4.55454, 3.71851, 1.63714, 0.965202, 1.85071, -1.25559, 0.631151, 2.15124,
2.51247, 2.70445, -0.184883, 3.88569, -3.83505, 0.195521, -4.26442, -7.13426,
-8.07395, 2.04592, -1.06977, 3.21081, 1.2704, 1.4404, -0.094883, -1.80753},
{0.253328, 1.40887, 3.36352, 2.85119, 2.62119, 1.31887, 1.69887, 0.563793,
0.617942, 1.01748, 1.29771, 1.01379, 1.14841, 1.69542, 2.98119, 3.69352,
2.57119, 0.763793, 0.457477, 2.63774, 8.28281, 1.59542, 1.44794, 1.01817,
1.34542, 0.76678, 0.834722, 0.917477, 0.724954, 1.16887, 1.66542, 1.42887,
0.914257, 2.93774, 1.08542, 0.985419, 1.15542, 7.16358, 11.2329, 4.38894,
2.57774, 2.16084, 0.69449, 1.46506, 0.921734, -0.583933, 0.604706, 1.49197,
2.30393, 1.96048, 0.317801, 2.58245, -1.01672, 1.17793, 0.199006, 7.16155, -
1.06979, 1.97805, 0.145603, 2.13025, 1.32012, 1.19012, 0.497801, 0.24664},
{0.0594374, 1.91498, 3.54606, 2.77052, 3.03052, 1.70498, 1.90498, 0.480546,
0.662766, 0.471658, 0.812212, 0.450546, 1.09387, 1.75442, 2.38052, 3.56606,
2.69052, 0.630546, 0.931658, 2.90996, 9.16822, 1.39442, 1.13277, 1.68332,
1.55442, 0.929996, 0.752762, 1.27166, 0.873316, 1.84498, 1.79442, 1.38498,
0.951654, 3.03996, 1.73442, 1.29442, 1.45442, 7.45157, 12.3637, 4.98049,
2.82996, 2.91885, 1.32221, 1.50275, 1.48331, 0.206646, 1.20107, 1.00387,
2.10773, 2.07717, 0.869957, 2.68104, -0.285626, 1.60546, 0.239787, 11.3587,
0.329616, 2.77095, 0.409915, 2.861, 1.3255, 1.4455, 0.749957, 0.0871872},
{-0.214275, 1.9441, 5.81564, 3.70487, 3.21487, 2.0841, 1.5441, 0.72388,
1.37779, 1.33963, 1.20871, 0.39388, 1.94594, 1.74742, 3.66487, 5.07564,
3.20487, 0.42388, 0.939633, 3.86819, 12.3887, 1.91742, 1.23779, 1.05687,
1.28742, 0.772401, 1.25019, 0.609633, 0.949266, 1.7341, 2.03742, 1.6141,
1.14203, 3.72819, 1.93742, 2.03742, 1.67742, 10.2146, 16.7277, 6.12307,
3.60819, 2.54484, 0.991112, 1.00739, 1.46167, -0.567656, 0.507364, 1.69074,
2.28149, 1.95481, -0.0759887, 2.79556, -3.23275, 0.288796, -3.57994, -
5.64473, -6.9739, 1.88358, -1.31198, 1.91957, 1.02478, 1.00478, -0.515989, -
0.731375},
{0.560464, 2.54543, 5.98058, 3.803, 4.203, 1.82543, 2.04543, 0.489979,
1.2164, 1.37688, 0.681641, 0.0699794, 2.09591, 2.11328, 4.193, 5.54058,
3.523, 0.479979, 0.916884, 3.25086, 13.2709, 2.02328, 0.996399, 1.63116,
1.84328, 0.302611, 1.44901, 1.10688, 1.38377, 2.50543, 1.81328, 1.92543,
1.27949, 3.37086, 2.03328, 1.96328, 1.75328, 10.449, 17.7993, 7.39386,
3.23086, 2.75656, 1.41425, 1.44684, 1.79638, -1.10027, -0.0995976, 1.06114,
2.15227, 2.45012, -0.773891, 3.49126, -3.23536, 0.329794, -4.09945, -6.07687,
-7.26502, 2.01348, -1.21778, 1.99737, 1.84368, 1.01368, -0.473891, -1.04768},
{0.0262033, 0.773105, 2.69417, 2.13364, 1.75364, 1.18311, 1.12311, 0.298309,
0.998893, 0.486787, 0.54784, 0.548309, 1.421, 1.03568, 2.07364, 3.26417,
1.63364, 0.648309, 0.806787, 1.83621, 6.44629, 1.40568, 0.748893, 0.939946,
1.34568, 0.843628, 0.822521, 0.856787, 1.30357, 1.60311, 0.94568, 0.833105,
0.160415, 2.13621, 1.33568, 0.81568, 0.85568, 4.70091, 8.29303, 3.78985,
1.90621, 1.94136, 0.761468, 0.553406, 0.697203, -0.770921, 0.0711313,
0.928256, 1.60651, 0.639086, -0.603718, 1.72734, -1.19797, -0.0569066, -
3.03859, -9.87563, -5.60347, 0.449905, -0.887437, 0.593624, 0.886812,
1.00681, -0.203718, -0.308984}};
/*_________End Weights ji______________________*/
/*___________Begin Weights kj__________________*/
double wkj[2][24]={{0.0644177, -1.66273, -3.62619, -10.4896, 0.16428, -
12.4576, 8.8171, -12.322, -0.289505, -0.392423, -0.490231, 14.4188, -
0.747429, 0.102296, 0.187079, -2.74119, -3.31169, -0.201028, -18.5548,
0.162079, -0.578729, -8.16069, -11.4766, -0.315152},
{0.558525, 2.07238, 3.5664, 11.1896, 1.20781, 12.7963, -7.48796, 12.2755,
0.781586, 1.76665, 0.98428, -12.8118, 1.69934, 1.06168, 0.817385, 3.44866,
3.45631, 0.487867, 18.3783, 0.579047, 0.610616, 9.05575, 11.7729, 1.15769}};
/*_____________End Weights kj___________*/
ADC_Start();// Begin analog to digital converter
ADC_StartConvert();// Begin analog to digital conversion
LCD_Char_Start(); //Start LCD display
LCD_Char_ClearDisplay();// Clear LCD display
LCD_Char_Position(0,0); // Set LCD character position to point 0,0
double magnitude[64];
COMPLEX wave1[128];
double xi[64];//values in nodes in layer 1 (switches for signal 1/2)
double xj[24];//values in nodes in layer 2
double xk[2];//values in nodes in layer 3
double sj[24];//intermediate weighted sums for the 24 hidden-layer nodes
double sk[2];//intermediate weighted sums for the 2 output-layer nodes
int i;//nodes in layer 1
int j;//nodes in layer 2
int k;//nodes in layer 3
int a;//variable for magnitude
int b = 0;//switch reading variable
int c = 0;//switch reading variable
int d = 0;//switch reading variable
double sumj = 0;//actual values of sj
double sumk = 0;//actual values of sk
// GENERATES TWIDDLE CONSTANTS
// 64 complex points
double arg;
COMPLEX z[64];
arg = 2 * 3.141592654 / 128;
for(i=0; i<64; i++)
{
z[i].real = cos(i * arg);
z[i].imag = -sin(i * arg);
}
//end initialization
for(;;)
{
/*______LCD Message_____________*/
LCD_Char_PrintString("Press to Start");// Displayed on LCD on start up
while(d == 0)
{
c = Pin_1_Read();
b = Pin_2_Read();
if (c != b)
{
d = 1;
}
}
d=0;
while(d == 0)
{
c = Pin_1_Read();
b = Pin_2_Read();
if (c == b)
{
d = 1;
}
}
LCD_Char_ClearDisplay();
d = 0;
wave1[0].real = 1.0*ADC_GetResult16();
CyDelay(1);
/*______ Wave 1: sample 128 ADC points and scale toward the 0-1 range ______*/
for(i = 0; i < 128; i++)
{
wave1[i].real = 1.0*ADC_GetResult16();
if(wave1[i].real < -100)
{
wave1[i].real =1.0- (-wave1[i].real)/100.0;
}
else if(wave1[i].real > -100)
{
wave1[i].real =1.0- (wave1[i].real)/100.0;
}
CyDelayUs(15);
wave1[i].imag = 0;
}
/*_________End Wave 1_______________*/
CyDelay(10);
/*_________FFT Input Signal**********/
FFT(wave1, z);
for(a = 64; a <128; a++)//calculate the magnitude with 1/2 the 128 points
{
magnitude[(-64 + a)] =
sqrt(wave1[a].real*wave1[a].real+wave1[a].imag*wave1[a].imag);
}
for(i = 0; i <64; i++)
{
xi[i] = magnitude[i];
}
/*___________________End FFT_____________*/
CyDelay(10);
/*________________Calculate Outputs____________*/
for (j = 0; j <24; j++)
{
sumj = 0;
for (i = 0; i < 64; i ++)
{
sumj = sumj + wji[j][i]*xi[i];//finding the sum of wji*xi
}
sj[j] = sumj/64.0;//store the scaled sum sj, dividing by the 64 inputs
xj[j] = tanh(sumj/64.0);//xj = tanh of sj
}
for (k = 0; k < 2; k++)
{
sumk = 0;
for (j = 0; j < 24; j++)
{
sumk = sumk + wkj[k][j]*xj[j];//find the sum of wkj*xj
}
sk[k] = sumk/24.0;//putting the sum sk into its array dividing by 24
xk[k] = tanh(sk[k]/24.0);//xk = the tanh of sk
}
for(k = 0; k <2; k++)
{
if (xk[k] < 0)
{
xk[k] = -1;//rounding function down
}
else if(xk[k] > 0)
{
xk[k] = 1;//rounding function up
}
}
/*__________End Calculations___________*/
CyDelay(10);
/*__________Output waveform__________*/
if((xk[0] == 1)&&(xk[1] == -1))
{
LCD_Char_ClearDisplay();
LCD_Char_PrintString("Sine");// If a sine wave is input via the waveform
//generator, the LCD will display "Sine"
}
else if ((xk[0] == -1)&&(xk[1] == 1))
{
LCD_Char_ClearDisplay();
LCD_Char_PrintString("Square");// If a square wave is input via the waveform
//generator, the LCD will display "Square"
}
/*_______________End Output______________*/
/*___________LCD Message____________*/
while(d == 0)
{
    c = Pin_1_Read(); // Read pin 1, store value in c
    b = Pin_2_Read(); // Read pin 2, store value in b
    if (c != b)
    {
        d = 1;
    }
}
d = 0;
while(d == 0)
{
    c = Pin_1_Read(); // Read pin 1, store value in c
    b = Pin_2_Read(); // Read pin 2, store value in b
    if (c == b)
    {
        d = 1;
    }
}
LCD_Char_ClearDisplay(); // Clear LCD screen
d = 0;
/***********End LCD Message**********/
}
return;
}

finally confirm that it indeed functioned in the desired manner. After completing the design of the code, we finished our work by observing that the neural network did indeed correctly identify the two types of input waves. However, our learning algorithm did have some trouble, which we believe can be attributed to incorrect assignment of weight values to certain nodes.

Problem Description

The task given in this lab was stated quite simply: we were to create a neural network program able to learn and adapt to identify two types of given input signals. The network would be based mainly upon assigning weight values to various nodes, allowing the program to learn from its mistakes and, after many iterations, correctly identify square and sine input waves. The lab did give some guiding information: the network needed 64 input nodes, obtained by performing a 128-point FFT on the input signal, which would then be connected to 24 nodes in the hidden layer, from which the two output nodes would be computed by the board.
To teach the board to perform such a task, we needed to create both a training mode (to allow the board to learn which output bits result in which answers) and an identification mode (to verify that it worked in the correct manner). Appendix A, attached to the lab sheet, provided much relevant information on how to attack this problem, such as a diagram of the training procedure and methods we could use to assign weight values to each node in our program. The aforementioned node diagram describing the training procedure can be seen below in Figure 1. Using this information, along with background knowledge and examples of neural networks found online, we became well-equipped to solve this complex problem.

Figure 1: Illustration of Training Procedure
Materials and Procedure

Materials Used:
• PSoC Creator Software
• CY8CKIT-001 with PSoC 5LP Board
• PSoC Programmer
• Oscilloscope and corresponding probes
• Waveform generator

Procedure:
• Perform background research on neural networks and view working examples
• Create a training mode for the network based on back-error propagation techniques
• Run the simulation protocol to allow the network to learn and determine weights for its nodes
• Create an identification mode for the network to verify accurate functionality
• Run the network and observe the output results to verify the success of the program

Results

We began this lab by researching neural networks and working examples. In computer science, artificial neural networks are forms of computing architecture inspired by biological neural networks, such as the brain, and are used to estimate or approximate functions that depend on a large number of generally unknown inputs. An artificial neural network is made up of systems of interconnected neurons that compute values from inputs, and it is capable of machine learning and pattern recognition due to its adaptive nature. The next part of the lab involved creating a training mode for the neural network using back-error propagation techniques. By giving the network a set of inputs, it is trained to give a
desired response. If the network gives the wrong answer, it is corrected by adjusting its parameters so that the error is reduced. During this correction process, one starts with the output nodes and propagates the error backward toward the input nodes, repeating until all the weight values converge. For this lab, the weight values were calculated by a C++ program, since the PSoC board does not have the capacity to calculate the weights itself. Even though the weights were calculated on a computer, the program took extremely long to run (approximately 4 hours). The final part of the lab involved taking a 128-point FFT of a signal and feeding the generated points into the neural network. The PSoC board, programmed with the neural network (Figure 2), had to recognize two distinct input signals scaled between 0 and 1. The first signal was a sinusoidal input, with a value of +1 for node 0 and -1 for node 1 in the output layer. The other was a square wave input, with a value of -1 for node 0 and +1 for node 1 in the output layer. When a sine signal was generated from a waveform generator at 1 volt peak-to-peak at 100 Hz, the PSoC board recognized the input signal through its FFT inputs, which were fed through the neural network, and displayed 'Sine' on the LCD screen (Figure 3). Likewise, when a square wave signal was fed to the PSoC board, the LCD screen displayed 'Square' (Figure 4). Thus our neural network was a success, and the board was able to recognize both input signals.
Figure 2: Neural network block diagram with ADC converter and LCD

Figure 3: Neural network recognizing sine wave
Figure 4: Neural network recognizing square wave

Discussion

In this project, we were instructed to implement a neural network able to recognize a sine wave and a square wave, both of unity amplitude. This is accomplished by taking a 128-point FFT of the signal and feeding the generated points into the neural network. The network uses a three-layer structure: a set of nodes that accepts the FFT points as input, a middle hidden layer with a transfer function that alters the output of the first layer, and finally an output layer that forms the desired result. The network is told from the outset to expect a certain output from the final layer of nodes given a certain input into the input layer. In our case, we want the output to identify the waveform, and the input to be the FFT of that waveform. The learning algorithm uses values called weights to calculate the error in the output nodes (in other words, how close the network got to the expected output), and then uses that error value to adjust the
weights in an attempt to get closer to the desired output. Each node in the middle and output layers has certain weights associated with the node it receives input from. We assigned arbitrary weights at first to give the algorithm a starting place. Subsequently, using formulas given to us, we implemented an algorithm in C++ that calculates the error in the output and then uses that error value to adjust the weights. As a result, every time the learning algorithm runs, it gets closer to the correct output value, until all of the error values converge to 0, meaning we have exactly the output we desire. Our neural net was able to give a reasonable output for the square wave, but was unable to reproduce a sine wave perfectly. To fix this, we would need to run the learning algorithm at least one more time to recalculate the weights. Unfortunately, this process takes many hours, and we did not have the time to do so. Given more time, the algorithm would have been able to adjust the weights to give an adequate output.

Conclusion

In this lab, we learned how to work with and write high-level C code, and how to simulate and verify the results using the PSoC 5LP board. We also learned how to design a functioning neural network, and how to effectively code and implement such a program for microcomputer projects. We were able to identify the various errors that can occur during the coding process, and to correct them with the software, hardware, and various debugging methods. We incorporated knowledge and code from many examples that we researched online in completing our project. These newly learned skills (along with knowledge retained from previous projects) then allowed us to code and analyze the two routines that we needed to successfully implement in order to create this neural network.
We were also able to understand the purpose of the many functions and variables utilized in our network programs,
but perhaps more importantly, we were able to identify the ways in which we needed to alter these components to fit the necessary tasks of our overall project. In performing the project, we were initially overwhelmed by its scope. However, through background research and the study of examples, we learned the various concepts a neural network involves, and how to implement them in our C code. We did run into a good amount of trouble when developing the learning algorithm, as the weights assigned to various nodes did not at first allow our network to learn to identify signals correctly, among other issues. We were also surprised by the amount of time the learning routine needed in order to correctly identify the input signals, as our program ran for several hours before it could be used to identify a square and a sine wave. Overall, the neural network is a program with wide-reaching applications in the field of microcomputer systems. For example, neural networks can be used in items ranging from home appliances to entire power grids, where learning algorithms self-adjust to allocate power more efficiently based upon observations of usage and intensity. The knowledge and experience gained from successfully completing this project will therefore prove useful in the future. This lab also helps us understand how neural networks are implemented in various programs, as well as the many applications for this type of network, furthering our knowledge in this field and helping us solve future electrical and computer engineering projects.
Appendix: Code for Neural Network

#include <device.h>
#include <math.h>

// Creates the complex structure that will be used throughout the code.
// There is one real part and one imaginary part.
struct cmpx {double real; double imag;};
// Initializes COMPLEX to complex structure type
typedef struct cmpx COMPLEX;

// This function performs the FFT of an input sample array
void FFT(COMPLEX *Y, COMPLEX *w) // input sample array
{
    COMPLEX temp1, temp2;               // Temporary storage variables
    int i, j, k;                        // For-loop counter variables
    int upper_leg, lower_leg, leg_diff; // Index of upper/lower butterfly leg and difference between them
    int index, step;                    // Index and step between twiddle factors
    leg_diff = 64; // starting difference between upper and lower butterfly leg
    step = 1;      // step between values in twiddle factor array for 128-point FFT
    // Fills up array
    for(i=0; i<7; i++)
    {
        index = 0;
        for(j=0; j<leg_diff; j++)
        {
            for(upper_leg=j; upper_leg<128; upper_leg+=2*leg_diff)
            {
                lower_leg = upper_leg + leg_diff;
                temp1.real = Y[upper_leg].real + Y[lower_leg].real;
                temp1.imag = Y[upper_leg].imag + Y[lower_leg].imag;
                temp2.real = Y[upper_leg].real - Y[lower_leg].real;
                temp2.imag = Y[upper_leg].imag - Y[lower_leg].imag;
                Y[lower_leg].real = temp2.real * w[index].real - temp2.imag * w[index].imag;
                Y[lower_leg].imag = temp2.real * w[index].imag + temp2.imag * w[index].real;
                Y[upper_leg].real = temp1.real;
                Y[upper_leg].imag = temp1.imag;
            }
            index += step;
        }
        leg_diff = leg_diff/2;
        step *= 2;
    }
    // bit reversal for re-sequencing data
    j = 0;
    for(i=1; i<127; i++)
  • 12. 12 { k = 64; while(k<=j) { j = j-k; k = k / 2; } j = j + k; if(i<j) { temp1.real = Y[j].real; temp1.imag = Y[j].imag; Y[j].real = Y[i].real; Y[j].imag = Y[i].imag; Y[i].real = temp1.real; Y[i].imag = temp1.imag; } } return; } //end of FFT function void main() { /*________ Initialize weights calculated via C++ ji_________*/ double wji[24][64] = {{0.1781, 1.32975, 3.6107, 2.68523, 2.12523, 1.59975, 1.44975, 0.487195, 1.19156, 0.312466, 0.812014, 0.497195, 0.750656, 1.04403, 2.84523, 2.9307, 2.31523, 0.637195, 1.09247, 2.3395, 7.8526, 0.874027, 1.10156, 0.931109, 1.05403, 0.173824, 0.935385, 1.06247, 1.50493, 0.999751, 1.11403, 1.07975, 0.83629, 1.8195, 1.74403, 1.54403, 1.52403, 6.18568, 9.77758, 4.10473, 2.2995, 2.11805, 1.20584, 1.03686, 1.35876, 0.129482, 0.749685, 0.888304, 1.65661, 1.65088, 0.688238, 2.21919, -0.648999, 1.23195, 0.321188, 8.02169, -0.0458608, 1.86566, 0.376475, 2.27742, 1.09371, 1.75371, 0.538238, 0.1405}, {0.595544, 1.5929, 3.96685, 2.86988, 2.33988, 1.6229, 1.0629, 0.552939, 0.658109, 1.13071, 0.554411, 0.882939, 1.3755, 1.55882, 2.35988, 4.39685, 2.36988, 0.792939, 0.470714, 2.6458, 8.62476, 1.62882, 0.808109, 1.34181, 1.67882, 0.679621, 0.56773, 1.28071, 1.16143, 1.6929, 1.51882, 1.4329, 0.420334, 2.2058, 1.80882, 1.07882, 1.50882, 7.78963, 12.5675, 5.13567, 2.8258, 2.58764, 0.974032, 0.616592, 1.08105, -0.52484, -0.0656388, 1.25475, 1.80949, 1.48541, -0.513793, 2.43016, -2.37456, 0.359391, -2.81896, -4.29961, -5.38413, 0.912575, -0.387585, 1.85637, 1.51318, 1.11318, 0.336207, - 1.19228}, {0.41051, 4.36401, 11.2477, 7.43585, 8.04585, 4.44401, 4.38401, 1.12288, 2.76927, 1.7269, 2.40308, 0.912879, 2.86164, 4.18617, 8.10585, 11.5677, 7.48585, 1.54288, 1.3469, 7.28801, 26.3351, 3.54617, 2.13927, 2.68545, 3.76617, 1.04835, 2.98762, 1.6269, 2.8838, 4.29401, 3.65617, 4.43401, 2.02525, 7.62801, 3.33617, 3.66617, 3.80617, 22.7576, 37.7649, 14.5939, 7.45801, 7.17234, 2.35143, 3.00266, 3.21215, -0.978675, 1.76915, 3.80833, 6.33667, 5.85883, 
1.28347, 9.32716, -0.984896, 6.33879, 6.81737, 66.7304, 10.8013, 11.8241, 2.67695, 10.4206, 5.48532, 5.65532, 1.63347, -0.497448}, {-0.0241987, 2.00168, 5.52173, 4.0367, 3.6867, 1.99168, 1.63168, 0.920807, 0.991666, 1.17666, 1.04916, 0.810807, 1.30667, 1.69833, 3.7967, 5.45173, 3.8167, 0.690807, 0.56666, 2.95335, 12.7018, 1.22833, 1.71167, 0.964168, 1.96833, 0.139152, 1.22082, 0.73666, 1.77332, 2.43168, 1.68833, 1.68168, 0.825812, 3.01335, 1.24833, 2.14833, 1.78833, 9.76011, 17.3402, 6.89006,
    3.82335, 2.70665, 1.05831, 0.498274, 1.52247, -1.46095, 0.268222, 1.29498, 2.73995, 2.3466, -0.47848, 3.31157, -2.66199, 0.348068, -3.3624, -5.89371, -6.82632, 1.22462, -0.91696, 2.18309, 1.05655, 1.07655, -0.27848, -1.07099},
    {0.0293048, 1.08133, 2.41171, 1.97152, 1.91152, 0.901332, 1.05133, 0.503343, 0.883256, 0.169219, 0.806237, 0.283343, 0.567294, 0.562475, 1.85152, 2.82171, 2.02152, 0.903343, 0.749219, 1.55266, 4.77247, 0.682475, 1.15326, 0.970275, 0.872475, 0.498162, 1.19142, 0.709219, 0.988437, 1.03133, 0.582475, 1.47133, 0.10738, 1.76266, 1.08247, 1.11247, 0.432475, 3.82456, 6.58532, 2.91418, 1.29266, 1.66495, 1.0744, 0.365904, 1.1966, -0.26498, -0.0806674, 1.07362, 1.08724, 1.29838, -0.368382, 1.422, -1.76695, -0.436574, -2.76191, -10.0946, -4.88543, -0.224767, -1.10676, 0.503615, 0.831807, 0.321807, -0.308382, -0.173476},
    {0.476395, 3.02602, 7.34247, 5.40425, 4.60425, 2.76602, 2.59602, 0.338041, 1.64272, 1.09108, 1.5019, 0.548041, 1.60437, 2.6038, 4.76425, 7.39247, 5.21425, 0.558041, 0.831078, 4.97203, 17.1454, 2.2338, 1.50272, 1.66355, 2.5138, 0.418609, 1.68133, 1.18108, 2.02216, 2.06602, 2.0638, 2.77602, 1.44969, 4.93203, 2.3638, 1.7838, 2.2338, 13.6027, 23.4157, 8.63628, 4.77203, 3.4376, 1.32051, 1.30716, 2.01076, -1.37468, -0.219482, 2.00159, 3.10318, 2.73096, -0.60391, 3.72255, -3.73605, 1.08041, -5.22955, -7.68152, -9.59519, 1.49473, -1.93782, 2.81864, 2.28432, 1.91432, -0.16391, -1.36803},
    {0.751677, -0.966573, -5.1527, -2.44964, -2.59964, -1.16657, -0.586573, 0.399064, -0.421347, 0.0612656, -0.570041, 0.269064, -0.80396, -1.04008, -2.42964, -4.4227, -2.75964, 0.409064, 0.301266, -2.59315, -12.095, -1.20008, -0.611347, -0.252654, -1.19008, 0.555185, 0.213838, -0.358734, -0.827469, -0.656573, -1.26008, -1.03657, 0.0764507, -3.07315, -0.850082, -1.05008, -0.880082, -9.70891, -16.6912, -5.67278, -2.18315, -1.86016, -0.464856, -0.210607, 0.107716, 1.64413, 0.298868, -0.45359, -1.62718, -1.01069, 0.841851, -2.10428, 4.20677,
-0.00936273, 4.27926, 3.9924, 7.53666, - 0.880577, 1.5937, -1.27243, -0.691214, -0.941214, 1.11185, 2.48838}, {-0.230998, 2.45367, 5.67823, 4.17595, 3.81595, 2.40367, 2.02367, 0.121459, 1.33876, 0.9463, 1.09253, 0.441459, 1.76121, 1.83506, 4.62595, 6.50823, 3.89595, 0.0714588, 1.1863, 3.49734, 13.7974, 1.59506, 1.01876, 1.18498, 2.05506, 0.117615, 1.74637, 0.5363, 1.2926, 1.96367, 2.22506, 2.11367, 0.473915, 4.26734, 2.36506, 1.62506, 2.01506, 11.2179, 19.087, 7.06329, 4.10734, 3.49011, 1.08014, 0.739218, 1.06022, -1.35406, -0.216621, 1.48644, 2.71289, 2.69427, -0.413847, 3.84072, -2.81998, 0.754588, -3.67923, -6.64332, -7.49462, 2.02302, -1.17769, 2.20687, 1.19844, 1.55844, -0.383847, -1.68499}, {0.198232, 1.97458, 5.1803, 3.15244, 3.93244, 2.20458, 1.57458, 0.677804, 1.25544, 1.06587, 1.41065, 0.697804, 1.17501, 2.14131, 4.07244, 5.3003, 3.78244, 0.587804, 0.835867, 3.55917, 11.4417, 1.55131, 1.09544, 1.88023, 1.42131, 0.911509, 0.796949, 0.755867, 1.37173, 2.01458, 1.74131, 2.28458, 0.507377, 3.79917, 2.13131, 2.13131, 1.90131, 10.2773, 16.3188, 6.09161, 3.45917, 2.70261, 1.26216, 1.25148, 0.963244, -0.578152, 0.201646, 1.77803, 2.21606, 2.41278, 0.0650921, 3.18081, -1.46768, 1.71804, -1.35454, 8.02823, - 2.00417, 2.531, -0.409816, 3.70591, 1.99295, 2.00295, 0.525092, -0.438838}, {0.21548, 1.57082, 3.60625, 2.12353, 2.61353, 1.83082, 1.11082, 0.094023, 0.573733, 1.08519, 1.06446, 0.064023, 1.12228, 1.49892, 2.84353, 3.59625, 2.27353, 0.124023, 0.89519, 1.79164, 8.26711, 1.40892, 1.35373, 0.923005, 1.46892, -0.0226243, 0.691109, 0.63519, 1.14038, 1.05082, 0.748923, 1.08082, 0.952566, 2.35164, 1.00892, 1.13892, 1.18892, 6.4606, 10.9215, 4.35517, 2.15164, 1.67785, 1.01184, 0.873236, 0.517756, -0.743999, -0.252451, 0.897028, 1.73406, 0.72216, -0.576243, 1.78919, -2.4152, -0.59977, -4.14121, -17.3341, -8.82618, -0.543298, -1.13249, 0.212945, 0.556472, 0.0764724, - 0.756243, -1.2526}, {0.363154, 2.2314, 6.0747, 3.57805, 3.60805, 2.0114, 2.3514, 0.650484, 1.60673, 
1.0094, 1.05307, 1.01048, 1.37406, 2.11614, 3.74805, 6.2047, 4.14805, 0.870484, 1.0594, 3.68279, 12.7213, 1.94614, 1.83673, 1.5304,
    2.26614, 0.828409, 1.41514, 0.849405, 1.91881, 2.4014, 2.04614, 2.0314, 0.877814, 3.62279, 1.92614, 1.90614, 2.12614, 10.6341, 18.7207, 7.12084, 3.92279, 3.15228, 0.921479, 1.60037, 1.96722, -0.703124, 0.884606, 1.82088, 2.67177, 2.44651, 0.854095, 3.7774, -1.43846, 2.48484, 0.0804741, 13.8304, -0.323147, 3.50559, 0.49819, 3.98149, 2.65075, 2.03075, 0.814095, -0.764231},
    {1.04803, -1.52188, -5.64641, -3.83914, -3.57914, -1.27188, -1.26188, 0.172579, -0.130978, 0.264474, 0.141748, 0.762579, -1.16643, -1.0465, -3.78914, -5.90641, -3.47914, 0.262579, 0.104474, -3.55377, -15.0855, -1.2565, -0.330978, -0.363704, -1.1065, 0.492653, -0.518326, -0.175526, -0.451052, -1.59188, -0.936504, -1.59188, 0.217127, -3.46377, -0.986504, -1.3265, -1.5865, -12.1474, -20.8565, -7.09291, -3.71377, -2.80301, -0.0555994, -0.250368, -0.148399, 1.99493, 1.12577, -0.701125, -2.26225, -1.73687, 1.74653, -3.068, 4.90031, 0.145791, 4.96263, 5.26085, 9.22874, -0.944945, 2.54305, -2.00147, -1.25074, -0.970735, 1.83653, 2.78016},
    {0.400277, 1.35833, 3.89819, 2.53326, 2.63326, 1.29833, 1.36833, 0.787264, 1.26435, 0.407365, 0.560859, 0.667264, 0.801339, 0.721718, 2.19326, 3.34819, 2.43326, 0.0472637, 1.11737, 2.06665, 7.65793, 0.881718, 1.09435, 1.03785, 1.58172, 0.246885, 0.451237, 1.16737, 1.08473, 1.82833, 1.34172, 1.14833, 0.61425, 2.36665, 1.60172, 0.881718, 0.761718, 6.09978, 9.85952, 4.48991, 2.31665, 1.59344, 0.887744, 0.871893, 0.701616, -0.282764, -0.257932, 1.34511, 0.980218, 0.83361, -0.611149, 1.86872, -2.43723, 0.222637, -4.02574, -12.9666, -7.32034, -0.223578, -1.5623, 1.19757, 0.363785, 0.983785, -0.761149, -0.718616},
    {0.279085, 2.14787, 4.59434, 2.9761, 2.8961, 2.04787, 1.56787, 0.148732, 0.798572, 0.428925, 1.28875, 0.568732, 1.85822, 1.9675, 2.7661, 4.74434, 2.4661, 0.638732, 0.978925, 2.57573, 9.32728, 1.7375, 1.05857, 1.3084, 1.9075, 0.479454, 0.938026, 0.428925, 1.02785, 1.50787, 1.6875, 1.82787, 0.708379, 2.83573, 1.1975, 1.7675, 1.4275, 7.94831, 13.3412,
5.31183, 2.94573, 2.57499, 0.508203, 1.16639, 1.2473, -0.542763, 0.73528, 1.37713, 2.58425, 1.77389, 0.624541, 2.69101, -0.659153, 1.21732, 0.182706, 9.57673, - 0.32913, 3.27009, 0.579082, 3.24555, 1.53278, 1.63278, 0.414541, -0.0745767}, {0.801652, 2.05456, 4.80349, 3.21903, 3.12903, 1.54456, 2.24456, 1.00454, 1.46878, 1.31589, 1.03733, 0.344544, 1.23167, 2.19467, 3.53903, 5.37349, 3.00903, 0.554544, 0.615887, 3.67913, 10.9513, 2.07467, 1.45878, 1.01023, 2.06467, 0.471549, 1.72033, 0.645887, 1.81177, 1.82456, 2.05467, 2.20456, 0.607436, 2.98913, 1.37467, 1.58467, 1.64467, 8.65708, 14.9249, 5.72815, 3.59913, 3.07933, 0.918882, 0.794976, 1.52332, 0.0621664, 0.325284, 1.77477, 2.46954, 2.13964, 0.68549, 3.86441, -0.493481, 1.89544, 0.987451, 14.8684, 0.949411, 3.92539, 0.0409802, 3.7499, 1.83995, 2.35995, 0.52549, -0.31174}, {-0.203802, 1.85697, 4.77407, 3.05052, 3.26052, 1.21697, 1.76697, 0.544908, 1.29955, 1.21084, 1.0502, 0.264908, 1.71826, 1.54039, 3.55052, 4.12407, 3.27052, 0.644908, 1.29084, 3.03394, 10.5283, 1.84039, 1.10955, 1.53891, 1.67039, 0.412776, 1.00233, 0.360841, 0.991683, 2.13697, 1.41039, 1.53697, 0.883618, 2.58394, 1.79039, 1.24039, 1.21039, 8.12156, 14.1058, 5.82446, 2.61394, 2.31079, 0.562973, 0.410657, 1.51446, -1.2167, -0.0790792, 1.32381, 2.26763, 1.46105, -0.462236, 2.79486, -1.78802, 0.729077, -2.93118, -4.95844, -5.90013, 1.61039, -0.714473, 1.87263, 1.51131, 0.781313, 0.0877636, - 0.999011}, {-0.30755, 1.82211, 5.16712, 3.37461, 3.82461, 1.69211, 1.60211, -0.081049, 1.3691, 0.732602, 0.645853, 0.398951, 1.4656, 1.27171, 3.07461, 4.77712, 3.36461, 0.138951, 1.1826, 2.95421, 11.1271, 1.23171, 0.679103, 0.962354, 1.05171, 0.46285, 1.02195, 1.2226, 1.2852, 1.61211, 1.19171, 2.21211, 0.985452, 3.44421, 1.30171, 1.70171, 1.22171, 9.42384, 16.1739, 5.50882, 2.69421, 1.93341, 0.918703, 0.490504, 0.828054, -1.53955, -0.380697, 1.4413, 1.28261, 1.50221, -1.6615, 1.50351, -5.5155, -1.97049, -9.47749, -38.3379, - 17.7335, -1.69948, -3.833, 0.0220166, 
    -0.0389917, 0.0210083, -1.3515, -2.61775},
    {0.144587, 1.05116, 4.31508, 2.75312, 2.92312, 1.91116, 1.64116, 0.94698, 1.54637, 0.583981, 0.970177, 0.51698, 0.858766, 1.28035, 2.49312, 3.58508, 2.31312, 0.69698, 0.743981, 2.22232, 8.91294, 1.69035, 0.886373, 1.57257, 1.64035, 0.385392, 1.13176, 0.363981, 1.43796, 1.22116, 1.08035, 1.06116, 0.509372, 2.20232, 1.48035, 1.80035, 1.88035, 6.74936, 11.9572, 5.21544, 2.43232, 2.03071, 0.775569, 0.94794, 1.23335, -0.129436, 0.775526, 1.44955, 1.7691, 2.44829, 0.713917, 2.33784, -0.534129, 1.1498, 0.699585, 9.66343, 0.0452529, 2.66568, 0.347834, 2.44176, 1.74588, 1.71588, 0.153917, 0.162935},
    {0.284495, 2.54727, 6.65784, 4.55256, 4.76256, 2.15727, 2.35727, 0.235552, 1.49516, 1.3841, 1.32463, 0.0855521, 1.83621, 1.63925, 4.59256, 6.60784, 4.00256, 0.475552, 1.1841, 4.41454, 14.959, 2.05925, 1.20516, 2.12568, 1.53925, 0.582512, 1.03767, 1.1841, 1.3282, 2.16727, 1.93925, 2.73727, 0.546609, 3.67454, 2.05925, 2.44925, 1.89925, 11.8177, 20.3088, 8.4471, 4.55454, 3.71851, 1.63714, 0.965202, 1.85071, -1.25559, 0.631151, 2.15124, 2.51247, 2.70445, -0.184883, 3.88569, -3.83505, 0.195521, -4.26442, -7.13426, -8.07395, 2.04592, -1.06977, 3.21081, 1.2704, 1.4404, -0.094883, -1.80753},
    {0.253328, 1.40887, 3.36352, 2.85119, 2.62119, 1.31887, 1.69887, 0.563793, 0.617942, 1.01748, 1.29771, 1.01379, 1.14841, 1.69542, 2.98119, 3.69352, 2.57119, 0.763793, 0.457477, 2.63774, 8.28281, 1.59542, 1.44794, 1.01817, 1.34542, 0.76678, 0.834722, 0.917477, 0.724954, 1.16887, 1.66542, 1.42887, 0.914257, 2.93774, 1.08542, 0.985419, 1.15542, 7.16358, 11.2329, 4.38894, 2.57774, 2.16084, 0.69449, 1.46506, 0.921734, -0.583933, 0.604706, 1.49197, 2.30393, 1.96048, 0.317801, 2.58245, -1.01672, 1.17793, 0.199006, 7.16155, -1.06979, 1.97805, 0.145603, 2.13025, 1.32012, 1.19012, 0.497801, 0.24664},
    {0.0594374, 1.91498, 3.54606, 2.77052, 3.03052, 1.70498, 1.90498, 0.480546, 0.662766, 0.471658, 0.812212, 0.450546, 1.09387, 1.75442, 2.38052, 3.56606, 2.69052, 0.630546, 0.931658, 2.90996, 9.16822,
1.39442, 1.13277, 1.68332, 1.55442, 0.929996, 0.752762, 1.27166, 0.873316, 1.84498, 1.79442, 1.38498, 0.951654, 3.03996, 1.73442, 1.29442, 1.45442, 7.45157, 12.3637, 4.98049, 2.82996, 2.91885, 1.32221, 1.50275, 1.48331, 0.206646, 1.20107, 1.00387, 2.10773, 2.07717, 0.869957, 2.68104, -0.285626, 1.60546, 0.239787, 11.3587, 0.329616, 2.77095, 0.409915, 2.861, 1.3255, 1.4455, 0.749957, 0.0871872}, {-0.214275, 1.9441, 5.81564, 3.70487, 3.21487, 2.0841, 1.5441, 0.72388, 1.37779, 1.33963, 1.20871, 0.39388, 1.94594, 1.74742, 3.66487, 5.07564, 3.20487, 0.42388, 0.939633, 3.86819, 12.3887, 1.91742, 1.23779, 1.05687, 1.28742, 0.772401, 1.25019, 0.609633, 0.949266, 1.7341, 2.03742, 1.6141, 1.14203, 3.72819, 1.93742, 2.03742, 1.67742, 10.2146, 16.7277, 6.12307, 3.60819, 2.54484, 0.991112, 1.00739, 1.46167, -0.567656, 0.507364, 1.69074, 2.28149, 1.95481, -0.0759887, 2.79556, -3.23275, 0.288796, -3.57994, - 5.64473, -6.9739, 1.88358, -1.31198, 1.91957, 1.02478, 1.00478, -0.515989, - 0.731375}, {0.560464, 2.54543, 5.98058, 3.803, 4.203, 1.82543, 2.04543, 0.489979, 1.2164, 1.37688, 0.681641, 0.0699794, 2.09591, 2.11328, 4.193, 5.54058, 3.523, 0.479979, 0.916884, 3.25086, 13.2709, 2.02328, 0.996399, 1.63116, 1.84328, 0.302611, 1.44901, 1.10688, 1.38377, 2.50543, 1.81328, 1.92543, 1.27949, 3.37086, 2.03328, 1.96328, 1.75328, 10.449, 17.7993, 7.39386, 3.23086, 2.75656, 1.41425, 1.44684, 1.79638, -1.10027, -0.0995976, 1.06114, 2.15227, 2.45012, -0.773891, 3.49126, -3.23536, 0.329794, -4.09945, -6.07687, -7.26502, 2.01348, -1.21778, 1.99737, 1.84368, 1.01368, -0.473891, -1.04768}, {0.0262033, 0.773105, 2.69417, 2.13364, 1.75364, 1.18311, 1.12311, 0.298309, 0.998893, 0.486787, 0.54784, 0.548309, 1.421, 1.03568, 2.07364, 3.26417, 1.63364, 0.648309, 0.806787, 1.83621, 6.44629, 1.40568, 0.748893, 0.939946, 1.34568, 0.843628, 0.822521, 0.856787, 1.30357, 1.60311, 0.94568, 0.833105, 0.160415, 2.13621, 1.33568, 0.81568, 0.85568, 4.70091, 8.29303, 3.78985, 1.90621, 1.94136, 0.761468, 0.553406, 
    0.697203, -0.770921, 0.0711313, 0.928256, 1.60651, 0.639086, -0.603718, 1.72734, -1.19797, -0.0569066, -3.03859, -9.87563, -5.60347, 0.449905, -0.887437, 0.593624, 0.886812, 1.00681, -0.203718, -0.308984}};
    /*_________ End Weights ji _________*/

    /*_________ Begin Weights kj _________*/
    double wkj[2][24] =
    {{0.0644177, -1.66273, -3.62619, -10.4896, 0.16428, -12.4576, 8.8171, -12.322, -0.289505, -0.392423, -0.490231, 14.4188, -0.747429, 0.102296, 0.187079, -2.74119, -3.31169, -0.201028, -18.5548, 0.162079, -0.578729, -8.16069, -11.4766, -0.315152},
    {0.558525, 2.07238, 3.5664, 11.1896, 1.20781, 12.7963, -7.48796, 12.2755, 0.781586, 1.76665, 0.98428, -12.8118, 1.69934, 1.06168, 0.817385, 3.44866, 3.45631, 0.487867, 18.3783, 0.579047, 0.610616, 9.05575, 11.7729, 1.15769}};
    /*_________ End Weights kj _________*/

    ADC_Start();             // Begin analog-to-digital converter
    ADC_StartConvert();      // Begin analog-to-digital conversion
    LCD_Char_Start();        // Start LCD display
    LCD_Char_ClearDisplay(); // Clear LCD display
    LCD_Char_Position(0, 0); // Set LCD character position to (0,0)

    double magnitude[64];
    COMPLEX wave1[128];
    double xi[64]; // Values of the nodes in layer 1 (FFT magnitudes of the input)
    double xj[24]; // Values of the nodes in layer 2
    double xk[2];  // Values of the nodes in layer 3
    double sj[64]; // Intermediate sums sj
    double sk[24]; // Intermediate sums sk
    int i; // Index over nodes in layer 1
    int j; // Index over nodes in layer 2
    int k; // Index over nodes in layer 3
    int a; // Index used when computing magnitudes
    int b = 0; // Switch-reading variable
    int c = 0; // Switch-reading variable
    int d = 0; // Switch-reading variable
    double sumj = 0; // Accumulator for sj
    double sumk = 0; // Accumulator for sk

    // GENERATE TWIDDLE CONSTANTS: 64 complex points
    double arg;
    COMPLEX z[64];
    arg = 2 * 3.141592654 / 128;
    for (i = 0; i < 64; i++) {
        z[i].real = cos(i * arg);
        z[i].imag = -sin(i * arg);
    }
    // End initialization
    for (;;)
    {
        /*______ LCD Message ______*/
        LCD_Char_PrintString("Press to Start"); // Displayed on LCD at start-up

        // Wait for the switch to be pressed...
        while (d == 0) {
            c = Pin_1_Read();
            b = Pin_2_Read();
            if (c != b) {
                d = 1;
            }
        }
        d = 0;
        // ...and released
        while (d == 0) {
            c = Pin_1_Read();
            b = Pin_2_Read();
            if (c == b) {
                d = 1;
            }
        }
        LCD_Char_ClearDisplay();
        d = 0;

        wave1[0].real = 1.0 * ADC_GetResult16();
        CyDelay(1);

        /*______________ Wave 1 ______________*/
        for (i = 0; i < 128; i++) {
            wave1[i].real = 1.0 * ADC_GetResult16();
            if (wave1[i].real < -100) {
                wave1[i].real = 1.0 - (-wave1[i].real) / 100.0;
            }
            else if (wave1[i].real > -100) {
                wave1[i].real = 1.0 - (wave1[i].real) / 100.0;
            }
            CyDelayUs(15);
            wave1[i].imag = 0;
        }
        /*____________ End Wave 1 ____________*/

        CyDelay(10);

        /*_________ FFT Input Signal _________*/
        FFT(wave1, z);
        // Calculate the magnitude using half of the 128 points
        for (a = 64; a < 128; a++) {
            magnitude[a - 64] = sqrt(wave1[a].real * wave1[a].real
                                   + wave1[a].imag * wave1[a].imag);
        }
        for (i = 0; i < 64; i++) {
            xi[i] = magnitude[i];
        }
        /*_____________ End FFT _____________*/

        CyDelay(10);

        /*________ Calculate Outputs ________*/
        for (j = 0; j < 24; j++) {
            sumj = 0;
            for (i = 0; i < 64; i++) {
                sumj = sumj + wji[j][i] * xi[i]; // Sum of wji * xi
            }
            sj[j] = sumj / 64.0;       // Store the sum sj, divided by 64
            xj[j] = tanh(sumj / 64.0); // xj is the tanh of sj
        }
        for (k = 0; k < 2; k++) {
            sumk = 0;
            for (j = 0; j < 24; j++) {
                sumk = sumk + wkj[k][j] * xj[j]; // Sum of wkj * xj
            }
            sk[k] = sumk / 24.0;        // Store the sum sk, divided by 24
            xk[k] = tanh(sk[k] / 24.0); // xk is the tanh of sk
        }
        for (k = 0; k < 2; k++) {
            if (xk[k] < 0) {
                xk[k] = -1; // Round negative outputs down
            }
            else if (xk[k] > 0) {
                xk[k] = 1;  // Round positive outputs up
            }
        }
        /*________ End Calculations ________*/

        CyDelay(10);

        /*________ Output waveform ________*/
        if ((xk[0] == 1) && (xk[1] == -1)) {
            LCD_Char_ClearDisplay();
            // If a sine wave is input via the waveform generator,
            // the LCD displays "Sine"
            LCD_Char_PrintString("Sine");
        }
        else if ((xk[0] == -1) && (xk[1] == 1)) {
            LCD_Char_ClearDisplay();
            // If a square wave is input via the waveform generator,
            // the LCD displays "Square"
            LCD_Char_PrintString("Square");
        }
        /*______________ End Output ______________*/

        /*___________ LCD Message ___________*/
        // Wait for another switch press...
        while (d == 0) {
            c = Pin_1_Read(); // Read pin 1, store value in c
            b = Pin_2_Read(); // Read pin 2, store value in b
            if (c != b) {
                d = 1;
            }
        }
        d = 0;
        // ...and release before restarting
        while (d == 0) {
            c = Pin_1_Read(); // Read pin 1, store value in c
            b = Pin_2_Read(); // Read pin 2, store value in b
            if (c == b) {
                d = 1;
            }
        }
        LCD_Char_ClearDisplay(); // Clear LCD screen
        d = 0;
        /*********** End LCD Message **********/
    } // End of for(;;) loop
    return;
}