T.O.M 3.0: the Tom who's finally going to catch Jerry!
By:
Team 1
Amit Bhakta
Kayla Lovelady
Michael Bruins
For
ME 4543.002 Mechatronics (Lab)
University of Texas at San Antonio
College of Engineering
San Antonio, Texas
Assignment Date
April 5th, 2013
Assignment Due Date
May 7th, 2013
Instructor
Brent M. Nowak, Ph.D., Associate Professor
Mechanical Engineering Department
II. Executive Summary
An autonomous vehicle (AV) is a vehicle that is able to operate independently of human
interaction. It uses sensors to examine its environment, and the AV is controlled through feedback
from those sensors. The challenge presented to our group is to design, build, and test an
autonomous vehicle that will locate and track down a red target (an RC car).
The project has two parts: tracking a non-moving red mouse and tracking a moving red mouse
(in the form of an RC car). The four-wheel AV will drive through a court and avoid obstacles as it
attempts to catch the mouse in both the dead- and living-mouse scenarios.
The goal for the first part of the project was to determine a safe operating distance for our ping
sensor so that the AV can avoid objects and walls. The distance to objects is determined with a
Parallax Ping sensor, which acts as an ultrasonic distance sensor. The sensor emits and
receives an ultrasonic signal at a pulse rate set by the user. The round-trip time of the
return signal received by the sensor is then converted to distance.
From a random stationary position, even if the bot is enclosed by three walls with only a small gap,
it should be able to find its way out by rotating the sensor and entering "Proximity Check" mode. In
this mode, the AV searches for an unobstructed path to travel while simultaneously
scanning for red objects to track.
The second goal of the project was to track an object, the mouse (RC car), based on the color red
with the CMUcam4. The CMUcam4 vision sensor is used by the cat to locate the red
object. It accomplishes this through an embedded camera module that searches
for a color bounded by user-specified minimum and maximum values for each channel. The
camera distinguishes colors by their channel values, which are unique to each color.
The sensors will be connected to an Arduino Mega 2560, an open-source platform that
can receive input from varying types of sensors. Once the input has been received, the 2560 will
interpret the data and assign a corresponding output to one or all four of the motors. This
input/output dynamic is handled by an embedded microprocessor on the 2560, governed
by a user-defined set of instructions (code written by the group members) in a C-based
programming language.
Prior to construction of the physical model, the AV was modeled in MATLAB Simulink as a
feedback control system utilizing a mathematical model. This analysis helps determine and
describe its movement.
Once the AV was constructed, the code was generated and compiled to meet specifications.
The obstacle-maneuvering code was written and compiled first. It was tested in the
Mechatronics lab and fine-tuned until optimal performance was achieved. The code for the ping
sensor on its servo mount and for the CMUcam4 was tested individually; then all code was integrated and tested.
Table of Contents

III. Introduction
Subsystem 1
Subsystem 2
Subsystem 3
IV. Technical Problem
V. Design Approach
V.1 State Space
V.2 Mathematical Model for 2 and 4 WD
Discussion of Camera
V.3 Integration
  Based on a global and current loop iteration time
  Ping Sensor
  Modes and states for the motors
  Arduino to Arduino I2C communication
V.4 General Assembly of the Circuit
V.5 Simulink Model
VI. Discussion of Data
VI.1 Discussion of Results of the Simulink Model
VI.2 Discussion of Data
VI.3 Simulink Model
Experimental
VII. Conclusions
VIII. Appendix
X.1 List of Figures
References

List of Figures

Figure 1: DC motor attached to the gears
Figure 2: Simulink Kinematic Model
Figure 3: Model of Subsystem
Figure 4: X and Y Graphs for a Goal of x=0 and y=0
Figure 5: Velocity Graph for a Goal of x=0 and y=0
Figure 6: X and Y Graphs for a Goal of x=5 and y=5
Figure 7: CMUcam4
Figure 8: Parallax Ultrasonic Ping Sensor
Figure 9: Arduino Mega 2560
Figure 10: I2C Communication Arduino Schematic
Figure 11: Wiring and Power Supplies Configuration
Figure 12: Wiring Configuration of the Arduino Board and Motor Shield
Figure 13: State Transition Diagram
Figure 14: State-flow Chart in Simulink
Figure 15: MATLAB model of step response
Figure 16: PD Controller (Simulink model)
Figure 17: Simulink Auto-tune PD Control with Step Response
Figure 18: CMUcam4 Confidence Threshold
Figure 19: RGB Color Space
Figure 20: YUV Color Space

List of Tables

Table 1: Motor Model & Specs: 130 Motor
Table 2: Requirements for T.O.M 3.0
Table 3: Power Budget
Table 4: Trial Runs for Ping Sensor with CMUcam4
III. Introduction
The control system implemented for this project controls a coupled three degree-of-freedom
(3 DOF) motion system. For this project we have chosen to manipulate the motors' speed in the
forward and backward directions on a mobile platform, a Parallax Ultrasonic Ping Sensor mounted on a
servo to scan left and right, and a CMUcam4 to detect a color. The Parallax Ultrasonic Ping Sensor
measures the distance to any object in its sight range. Using the position measured by the ping sensor,
the system controls the speed and direction of the motors in order to avoid objects. The servo motor that the
ping sensor is attached to is also controlled by the ping sensor readings. Depending on the distance of an
object, the servo motor can stay centered or rotate 45° left and right of the centered position to
scan the surrounding area. The CMUcam4 simultaneously detects the input color within a specified pixel
density threshold and confidence. On detection of the color, the motor platform transitions
into "tracking mode" and follows the detected colored object. For the dead mouse, the AV will
run into the non-moving RC car, and for the moving mouse the AV will follow it and finally ram into the
mouse at a higher speed.
PD controllers in a position- and velocity-based system are used to manipulate velocity and
position error via two independent gains, Kv and Kp (Equations 1 and 2), respectively. Simply put,
the position error of the plant is the difference between the current position and the target position. The
derivative error, or velocity error, is defined as the position error over the loop time. One thing to
remember with these definitions is that if the vehicle overshoots the target distance, the error becomes
negative. With most motors this causes problems because the system does not recognize a negative motor
speed, only a rotational direction. To resolve this, a series of if statements was implemented to
change the motor direction based on the sign of the calculated error.
As previously stated, the motor velocity is the part of the plant to be manipulated. Simply
leaving the motor speed constant would result in an undamped system that oscillates forward and
backward over the target distance. The controller in this system adds artificial damping that
slows the motor as a function of the measured distance error. To do this, a ratio of the total error (the sum of
the proportional and derivative errors) to the target distance changes the motor speed. If the error is measured
as zero, the motor receives an input of zero as its speed, effectively stopping the vehicle. Likewise, if
the error is larger, the vehicle accelerates until it comes within range of an object.
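A minimal sketch of this speed law and direction handling, assuming illustrative gains and hypothetical motor helpers driveForward(), driveBackward(), and stopMotors() (none of these names come from the report's listing):

const float Kp = 1.0;               // illustrative gains (the report's tuned values were Kp = 1, Kd = 2)
const float Kv = 2.0;
const long  targetDistance = 50;    // target stand-off distance, cm

unsigned long lastTime = 0;

void pdDrive(long measuredCm) {
  unsigned long now = millis();
  float dt = (now - lastTime) / 1000.0;              // loop time, seconds
  lastTime = now;

  long  posError = measuredCm - targetDistance;      // position error (negative on overshoot)
  float velError = (dt > 0) ? posError / dt : 0;     // position error over the loop time
  float total    = Kp * posError + Kv * velError;    // combined error

  // Motor speed scales with the ratio of total error to target distance.
  int speed = constrain((int)(255.0 * fabs(total) / targetDistance), 0, 255);

  // Motors take a magnitude plus a direction, so a negative (overshoot)
  // error flips the drive direction rather than the sign of the speed.
  if      (total > 0) driveForward(speed);
  else if (total < 0) driveBackward(speed);
  else                stopMotors();                  // zero error: stop the vehicle
}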
(A green LED will indicate that the target has been acquired.)
Subsystem 1:
Input/Output - Position. These share a transfer function in the form of a gain, $K_p$ (the potentiometer gain $K_{pot}$), which is the ratio of the voltage $V_i(s)$ to the input angle $\Theta_i(s)$, where $\Theta_i(s) = n\pi$ and $V_i(s) = V$.

$$\frac{V_i(s)}{\Theta_i(s)} = K_p \qquad \text{(Equation 1 - Position Gain)}$$
Subsystem 2:
Arduino constant gain $K$. This takes an input signal voltage $V_e(s)$ and outputs a signal voltage $V_A(s)$ for the DC motor to use.

$$\frac{V_A(s)}{V_e(s)} = K \qquad \text{(Equation 2 - Arduino Constant Gain)}$$
Subsystem 3:
DC motor attached to the gears and load. To analyze the complete subsystem, it is necessary to work from the inside out, using KVL to relate the input voltage to the motor and the motor to the position of the armature. Below is the schematic of the motor.

Figure 1: DC motor attached to the gears

KVL analysis resulted in:

$$R_a I_a(s) + L_a s I_a(s) + V_b(s) = V_A(s) \qquad \text{(Equation 3 - KVL)}$$
The back-emf is proportional to the angular velocity of the motor's rotor; $K_b$ is the back-emf constant.

$$V_b(s) = K_b s\,\theta_m(s) \qquad \text{(Equation 4 - Motor Velocity)}$$
The torque developed is a function of the motor-torque constant $K_t$ and the armature current $I_a(s)$:

$$T_m(s) = K_t I_a(s) \qquad \text{(Equation 5 - Motor Torque)}$$
The effective inertia load ($J_m$) on the motor is:

$$J_m = J_a + J_L \left(\frac{N_1}{N_2}\right)^2 \qquad \text{(Equation 6 - Effective Inertia)}$$
The effective damping ($D_m$) is:

$$D_m = D_a + D_L \left(\frac{N_1}{N_2}\right)^2 \qquad \text{(Equation 7 - Effective Damping)}$$
Using Newton's second law for rotation:

$$T_m(s) = J_m \theta_m(s)\, s^2 + D_m \theta_m(s)\, s \qquad \text{(Equation 8)}$$
The transfer function between angular displacement and input voltage $V_a(s)$ is:

$$\frac{\Theta_m(s)}{V_a(s)} = \frac{K_t/(R_a J_m)}{s^2 + s\left[\dfrac{1}{J_m}\left(D_m + \dfrac{K_t K_b}{R_a}\right)\right]} \qquad \text{(Equation 9 - Angular Displacement over Input Voltage)}$$
The back-emf constant is:

$$K_e = \frac{V_a}{\omega_{no\,load}} \qquad \text{(Equation 10 - Back-EMF Constant)}$$
The torque constant $K_t$ is:

$$K_t = \frac{T_m}{I_a} \qquad \text{(Equation 11 - Torque Constant)}$$
The gear ratio constant, $R_1$, is:

$$R_1 = \frac{N_1}{N_2} \qquad \text{(Equation 12 - Gear Ratio Constant)}$$
Given values, using reference [1]:

K_T = 0.019
R_a = 0.5
J_m = 7.06e-6
D_m = 3.56e-6
K_E = 0.019
K_p = 1
R_1 = 5.08 cm
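As a worked substitution (our arithmetic, taking $K_b = K_E$), these values in Equation 9 give approximately:

$$\frac{\Theta_m(s)}{V_a(s)} \approx \frac{0.019/(0.5 \cdot 7.06\times10^{-6})}{s\left[s + \dfrac{3.56\times10^{-6} + (0.019)(0.019)/0.5}{7.06\times10^{-6}}\right]} = \frac{5.38\times10^{3}}{s\,(s + 102.8)}$$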
Table 1: Motor Model & Specs: 130 Motor

Size: length 55 mm, width 48.3 mm, height 23 mm
Gear ratio: 1:120
- No-load speed (3 V): 90 RPM
- No-load speed (6 V): 180 RPM
- No-load current (3 V): 120 mA
- No-load current (6 V): 160 mA
- Locked-rotor current (3 V): 1.5 A
- Locked-rotor current (6 V): 2.8 A
- Output mode: two-way shaft output
IV. Technical Problem
Introduction
The chosen project, an example of a three-degree-of-freedom (3 DOF) system, is a
four-wheel-drive car programmed using an Arduino board. This car is to travel autonomously in the
forward, backward, left, and right directions. More specifically, the car will be programmed to track a
colored RC car using the CMUcam4, at a certain distance measured by a ping sensor, as the object moves
around the room. As the car traverses the room, it will use the ping sensor for object avoidance and will
stop when the target color is tracked. Once the color is no longer detected, the car will resume travel and object
avoidance until the color is tracked again. User-defined parameters specify what distance
to maintain from the object and what color object to follow.
Requirements

Table 2: Requirements for T.O.M 3.0

- 3 Degrees of Freedom (coupled): the direction and speed of each of the four independent motors, and the rotation of the ping sensor using a servo motor
- Sensors: Parallax Ultrasonic Ping Sensor and CMUcam4 to measure distance and color
- Set-off Tracking Distance: programmed to avoid objects at a 50 cm distance
- Track a Color: the CMUcam4 tracks the color red within a pixel and confidence threshold of 50
- PID and Logic Control: the car's position, velocity, and tracking capabilities are programmed using a mixture of PID and logic control
Materials and Sensor Description
A mobile platform with four independent motors is used, shown in Figure 1.
The CMUcam4 is a color-sensitive camera that detects a color within a certain confidence level,
shown in Figure 7.
The Parallax Ping sensor is an ultrasonic range finder that continuously detects and calculates the
distance of obstacles in the vehicle's path via sound signals, shown in Figure 8.
Two Arduino Mega 2560 boards are used to program the motors and the ping sensor mounted on the
vehicle, shown in Figures 9 and 10. A "Master and Slave" system is used for communication between the two
boards, as discussed in the Integration section.
A SainSmart L293D motor drive shield acts as a dual full-bridge driver, allowing control over the speed
and direction of each motor in the car and integration with the programming of the ping
sensor, the ping servo motor, and the CMUcam4.
Pixel and Confidence Threshold Description
Pixel density is the percentage of pixels tracked in the color-tracking window, reported on a scale
from 0 to 255, where 0 represents 0% and 255 represents 100%; 0 is returned if and only if no pixels
at all are tracked within the color-tracking window. Confidence is the corresponding percentage of
pixels tracked within the bounding box, on the same 0-255 scale. In our case, a threshold of 50
corresponds to only about a 19% (50/255) level of confidence for the tracking window.
B. Competing Constraints / Trade-offs
System constraints for T.O.M 3.0 are defined by the types of sensors used to control the vehicle, starting
with the sonar-based ping sensor. By implementing a single derivative component, the velocity error of the
system can be controlled. Using this position sensor, the speed of each of the four wheels can be controlled
to either slow or stop the motors as the vehicle approaches the desired position. Other constraints lie within
the CMUcam4 because of its field of view; the camera is set up to read the x position of an object between 0 and 160
pixels. If the object moves out of range, the CMUcam4 is free to lock on to any other object that meets its
color requirements. Between the CMUcam4 and the ping sensor, some constraints that stop the vehicle can
compete with each other; for example, if the red object is within the ping distance but out of the range of
the CMUcam4, the car will start searching for an exit because it thinks it has reached a wall.
The CMUcam4 can sense the distance of an object in the Z direction; if programmed
correctly, it could simultaneously keep track of a color and the position of the vehicle. The
problem is that one would have to specify the color of every object to avoid.
Likewise, as previously mentioned, the use of two sensors can cause them to compete with each other in
the code. A large trade-off for using two sensors is the power draw of each. The CMUcam4 draws a lot
of current and has a high voltage requirement. If the CMUcam4 and the ping sensor run on the
same source, they experience a large delay when both are active because of the power draw. This
requires both sensors to have their own power source, which can lead to weight issues.
V. Design Approach

[Figure: T.O.M 3.0 beside its target, with the callouts "The malicious mouse who stole ma' cheese" and "Jerry's escape pod."]
V.1 State Space
Using the equations for state space, the motor output state-space equation, with the control
variables taken into account, is:

$$\dot{\mathbf{x}} = \begin{bmatrix} 0 & 1 \\ 0 & -0.8828 \end{bmatrix}\mathbf{x} + \begin{bmatrix} 0 \\ 1 \end{bmatrix}(5\,\text{V}), \qquad y = \begin{bmatrix} 1 & 0 \end{bmatrix}\mathbf{x} \qquad \text{(Equation 13 - State-Space Equation)}$$

where the control variables are the output angle and the output angular velocity, respectively:

$$\mathbf{x} = \begin{bmatrix} \theta_o(t) \\ \dfrac{d\theta_o(t)}{dt} \end{bmatrix} \qquad \text{(Equation 14 - Control Variable Vector)}$$

$$\dot{\mathbf{x}} = \begin{bmatrix} \dfrac{d\theta_o(t)}{dt} \\ \dfrac{d^2\theta_o(t)}{dt^2} \end{bmatrix} \qquad \text{(Equation 15 - Control Variable Vector Derivative)}$$
Hence, this system of equations can properly describe the system's motion at any point in time,
for any input given to it. Figure 3 displays MATLAB™ code that can be used to solve for the
state-space response with this set of equations, along with the calculated response. While the code
and output were unsuccessful, the use of the ss function and the step(ss) command is the general approach
to computationally solving state-space problems in a matrix-based program.
V.2 Mathematical Model for 2 and 4 WD
Assuming a two-wheel bicycle kinematic model with no wheel slip, meaning there
is no lateral (body-frame y) velocity component:

$$\dot{\theta} = \frac{v}{R_1}, \qquad R_1 = \frac{L}{\tan\alpha}, \qquad R_2 > R_1, \qquad \gamma_L = \gamma_R$$

For the world frame:

$$\dot{x} = v\cos\theta, \qquad \dot{y} = v\sin\theta, \qquad \dot{\theta} = \left(\frac{v}{L}\right)\tan\gamma$$

with the no-slip (nonholonomic) constraint

$$\dot{y}\cos\theta - \dot{x}\sin\theta = 0$$
This constraint cannot be integrated to form a closed-form relationship between $x$, $y$, and $\theta$;
therefore, Simulink was used to model the system. While moving to a goal point $(x^*, y^*)$, the
velocity is controlled proportionally to the distance from the goal:

$$v^* = k_v \sqrt{(x^* - x)^2 + (y^* - y)^2}$$

and the desired heading is

$$\theta^* = \tan^{-1}\!\left(\frac{y^* - y}{x^* - x}\right)$$

Using a proportional controller on the heading error gives the steering angle:

$$\gamma = k_h(\theta^* - \theta), \qquad k_h > 0$$
For four-wheel drive, the turning radius is just the length of the car. This is, in effect, due to
the individual control of all four wheels (4WD). Applying the steering method above to
each wheel allows the bot to have a turning radius equal to the length of the car.
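A minimal C++ sketch of this goal-seeking law (the Pose struct and all names here are illustrative, not from the report's code):

#include <math.h>

struct Pose { double x, y, theta; };   // world-frame pose of the vehicle

// Proportional goal-seeking control: speed proportional to the distance
// to the goal, steering proportional to the heading error.
void goalSeek(const Pose& p, double xStar, double yStar,
              double kv, double kh, double& v, double& gamma) {
  double dx = xStar - p.x;
  double dy = yStar - p.y;
  v = kv * sqrt(dx * dx + dy * dy);        // v* = kv * distance to goal
  double thetaStar = atan2(dy, dx);        // theta* = atan2(y* - y, x* - x)
  double err = thetaStar - p.theta;        // heading error (theta* - theta)
  while (err > M_PI)  err -= 2.0 * M_PI;   // wrap to [-pi, pi] so the
  while (err < -M_PI) err += 2.0 * M_PI;   // bot turns the short way
  gamma = kh * err;                        // steering angle, kh > 0
}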
Table 3: Power Budget

Component                    Volt   mWatt    mAmp   Qty
Ping Sensor                    5      150      30     1
Servo                          5      700     140     1
CMUcam4                        9      215     120     1
4 Motors (no-load current)   3.4     21.9     120     1
Arduino (Master)               9     2000     400     1
Arduino (Slave)                5     2000     400     1
Total                         36   3086.9     810     9

Battery Capacity (mAh): 1400
Estimated Runtime (h): 0.68
Estimated Runtime (min): 34
Discussion of Camera
For this project a camera sensor, the CMUcam4, was programmed through the "Master" Arduino
microcontroller to track a particular color. The color space chosen was RGB, and minimum and
maximum values were specified for each channel to track the color red.
The RGB color space parameterizes the ranges of the primary colors red, green, and blue (Figure 19). In
other words, all colors are defined by these three primaries in different amounts and ratios.
Confidence and pixel tolerances were also given to further indicate when the camera is tracking the
colored object in the viewing window. The confidence level is the number of pixels in the tracking area
(the pixel density). Additionally, a filter for noise was added.
When the car is powered, the CMUcam4 starts its initialization protocol by sensing and adjusting to
the light level in its surrounding environment using auto-gain and auto white balance. Once these
auto-adjustments are complete, the CMUcam4 begins transmitting tracking data to the Master Arduino
through the TXO and RXI pins, and the Master relays the collected color-tracking data
through the SCL and SDA pins (clock and data pins) to the receiving "Slave" Arduino.
The car is programmed to stop the motors when the CMUcam4 detects an object of the tracking color
within the confidence tolerance, and to resume the driving states when an object is no longer detected.
As confirmation of proper CMUcam4 tracking, an LED was attached to the Arduino. Based on the
same color-tracking criteria, the LED turns on while the color is being tracked.
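This initialization-and-tracking flow matches the CMUcam4 Arduino library's standard tracking pattern. The sketch below is illustrative only: the RGB bounds and the 50-count thresholds are placeholders, not the report's tuned values.

#include <CMUcam4.h>
#include <CMUcom4.h>

// Illustrative "red" channel bounds; the report's tuned MIN/MAX values differ.
#define RED_MIN   150
#define RED_MAX   255
#define GREEN_MIN 0
#define GREEN_MAX 80
#define BLUE_MIN  0
#define BLUE_MAX  80

CMUcam4 cam(CMUCOM4_SERIAL);

void setup() {
  cam.begin();
  cam.LEDOn(5);                  // blink the onboard LED while settling
  delay(5000);                   // let auto-gain and auto white balance run
  cam.autoGainControl(false);    // then freeze the adjustments
  cam.autoWhiteBalance(false);
  cam.LEDOn(CMUCAM4_LED_ON);     // solid LED: camera ready
}

void loop() {
  CMUcam4_tracking_data_t data;
  cam.trackColor(RED_MIN, RED_MAX, GREEN_MIN, GREEN_MAX, BLUE_MIN, BLUE_MAX);
  cam.getTypeTDataPacket(&data); // type-T packet: centroid, pixels, confidence
  if (data.pixels > 50 && data.confidence > 50) {
    // Target acquired: data.mx / data.my hold the centroid; the Master
    // would forward mx, my, confidence, and pixel density to the Slave.
  }
}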
V.3 Integration
Based on a global and current loop iteration time
The entire program is timed using variables that track an overall global time, via the
built-in Arduino function millis(), together with the previous global time recorded for each
iteration. The global time tracker is reset after each loop.
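A minimal sketch of this timing pattern (variable names are illustrative):

unsigned long previousTime = 0;   // global time at the start of the last loop

void loop() {
  unsigned long currentTime = millis();                      // global time tracker
  unsigned long iterationTime = currentTime - previousTime;  // this loop's duration
  previousTime = currentTime;                                // reset for the next loop

  // iterationTime serves as the loop time, e.g. the dt in the velocity-error term.
}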
Ping Sensor
The ping sensor is programmed around a specified "set-off distance" of 50 cm from an object (obstacle);
if an object is sensed, it performs a proximity scan in order to determine the new
path direction. This is accomplished by mounting the ping sensor on a servo motor that swivels the ping
sensor through a 90° range, a 45° angular displacement left and right of center. A portion of the
Proximity Check function is shown below.
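The listing itself is not reproduced here, so the following is a reconstruction sketch: the distance measurement follows the standard Parallax Ping))) Arduino example, while the pin number, settling delays, and servo angles are assumptions.

#include <Servo.h>

const int  PING_PIN  = 7;     // assumed signal pin for the Ping))) sensor
const long SETOFF_CM = 50;    // set-off distance from the report
Servo pingServo;              // assumed attached to a PWM pin in setup()

// Standard Parallax Ping))) measurement: trigger a pulse, time the echo.
long pingDistanceCm() {
  pinMode(PING_PIN, OUTPUT);
  digitalWrite(PING_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(PING_PIN, HIGH); delayMicroseconds(5);
  digitalWrite(PING_PIN, LOW);
  pinMode(PING_PIN, INPUT);
  long us = pulseIn(PING_PIN, HIGH);  // round-trip echo time in microseconds
  return us / 29 / 2;                 // sound travels ~29 us per cm, one way
}

// Proximity check: swivel 45 degrees left and right of center and
// pick the direction with the greater clearance (angles assume this mounting).
int proximityCheck() {
  pingServo.write(135); delay(300);   // 45 degrees left of center
  long leftCm = pingDistanceCm();
  pingServo.write(45);  delay(300);   // 45 degrees right of center
  long rightCm = pingDistanceCm();
  pingServo.write(90);  delay(300);   // re-center
  return (leftCm > rightCm) ? 3 : 2;  // motor state: 3 = left, 2 = right
}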
Modes and states for the motors
The car's motion is controlled by four independent motors, one per wheel. The speed and direction
of the motors depend on the current mode:

Mode   Action
1      Drive
2      Proximity Check
3      Turning

Within each of these modes, the driving direction is specified by a state. The motor modes and
states are selected by conditional cases that depend on the distance detected by the ping sensor
and the set-off distance:

State   Driving Direction
0       Stop
1       Straight
2       Right
3       Left
In Mode 1, the motors continue to drive forward if the ping sensor detects an object farther than the
set-off distance and, if not already in State 1, change to State 1 in order to drive straight forward.
If the distance sensed by the ping sensor is less than the set-off distance, the motors change to
State 0 (if not already in State 0), which stops the car, and the system switches to Mode 2.
Mode 2 is a proximity check: after scanning, depending on the left and right distances detected by
the ping sensor, the system changes to Mode 3 and sets the motor state to State 2 or 3 in order to
avoid the obstacle. In other words, Mode 3 turns the car until the elapsed turning time equals the
specified turn length of 750 ms.
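A minimal sketch of this mode/state machine, reusing the SETOFF_CM constant and proximityCheck() helper sketched above (the structure is our reading of the description, not the report's listing):

int mode  = 1;                     // 1 = Drive, 2 = Proximity Check, 3 = Turning
int state = 1;                     // 0 = Stop, 1 = Straight, 2 = Right, 3 = Left
unsigned long turnStart = 0;
const unsigned long TURN_MS = 750; // turn length time from the report

void updateMotion(long distanceCm) {
  switch (mode) {
    case 1:                               // Drive
      if (distanceCm > SETOFF_CM) {
        if (state != 1) state = 1;        // drive straight forward
      } else {
        state = 0;                        // stop the motors
        mode  = 2;                        // scan for a clear path
      }
      break;
    case 2:                               // Proximity Check
      state = proximityCheck();           // 2 = turn right, 3 = turn left
      mode  = 3;
      turnStart = millis();
      break;
    case 3:                               // Turning
      if (millis() - turnStart >= TURN_MS) {
        mode  = 1;                        // resume driving
        state = 1;
      }
      break;
  }
  // `state` is then mapped to motor-shield commands (omitted here).
}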
Arduino to Arduino I2C communication
As previously mentioned, two Arduino boards were used in a "Master and Slave" configuration. This system
uses I2C communication and the Wire library to transmit data between the boards, with one board (the
Master) writing the data and the second board (the Slave) receiving it. This
communication requires connecting the SDA and SCL (data and clock) pins, 20
and 21 respectively (Figure 10), of the two boards together, as well as sharing common power and
ground lines. For the purpose of this project, the CMUcam4 was connected to the Master board, which
processed the color-tracking data and sent the tracking information to the Slave.
The Slave Arduino board received this data and integrated it with all four motors, the ping sensor, and
the servo motor that rotates the ping sensor.
In order to track the mouse, the car must be able to follow and turn with it. This is once again
accomplished by integrating the ping sensor and the CMUcam4. As mentioned previously, the ping sensor
and motors are integrated to control the velocity of the motors for object avoidance. In addition,
depending on the RGB min and max value ranges and the established confidence level, the CMUcam4
signals the start of the car's tracking mode if both the distance and color criteria are met. To
follow and turn with the mouse, each motor's speed is adjusted based on the colored object's
centroid data from the CMUcam4.
In the original listing, the left side of the code is the camera-tracking data loop on the Master
Arduino board. Communication with the Slave Arduino board is done with the commands
Wire.beginTransmission(2) and Wire.endTransmission(); the number 2 is the "address" of the Slave
board. The camera data is transmitted to the Slave using the command Wire.write(). The tracking
data listed below is transmitted to the Slave Arduino board:

Variable   Tracking Data Parameter
a          X centroid (mx)
b          Y centroid (my)
c          Confidence
p          Pixel density

The right side of the code is the Slave Arduino board's side. The top portion establishes
communication with the Master board, and below it is the turning function based on the
data received from the camera on the Master board via the command Wire.read().
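As the listing is not reproduced here, the following minimal sketch uses the standard Wire library calls named above; the handler name and one-byte packet layout are assumptions.

// ----- Master sketch: forward the tracking data over I2C -----
#include <Wire.h>

void setup() {
  Wire.begin();                   // join the bus as Master
}

void sendTrackData(byte a, byte b, byte c, byte p) {
  Wire.beginTransmission(2);      // 2 = Slave address
  Wire.write(a);                  // X centroid (mx)
  Wire.write(b);                  // Y centroid (my)
  Wire.write(c);                  // confidence
  Wire.write(p);                  // pixel density
  Wire.endTransmission();
}

// ----- Slave sketch: receive the data and steer -----
#include <Wire.h>

void setup() {
  Wire.begin(2);                  // join the bus as Slave at address 2
  Wire.onReceive(receiveEvent);   // register the receive handler
}

void receiveEvent(int howMany) {
  if (howMany >= 4) {
    byte a = Wire.read();         // X centroid (mx)
    byte b = Wire.read();         // Y centroid (my)
    byte c = Wire.read();         // confidence
    byte p = Wire.read();         // pixel density
    // The turning function would adjust individual motor speeds from mx
    // so the car follows and turns with the mouse (omitted here).
  }
}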
V.4 General Assembly of the Circuit
The general assembly of the circuit is shown in the wiring diagram below.

[Figure: Wiring diagram of the complete system]
V.5 Simulink Model
Figure 2 Simulink Kinematic Model
Figure 3 Model of Subsystem
VI. Discussion of Data
VI.1 Discussion of Results of the Simulink Model
The gains for the Simulink model in Figure 2 were set to Kv = 0.1, Kb = 0.2, and Kh = 15. For
Figure 3, the velocity limit is set to 6 mph, the acceleration limit to 3 ft/s², and the
steering-angle limit to 20 degrees. To test whether the Simulink model was functioning
properly, the desired x-position and y-position were both set to 0, so the car
should not move. This result is confirmed in Figure 4. The velocity should also remain at 0,
which is confirmed in Figure 5. In Figure 6, the desired x-position and y-position are both set
to 5. As expected, x approaches and remains at a value of 5, and y does the same. Figure 6 also
shows that the velocity decreases as the car approaches its target position of x = 5 and y = 5.
This Simulink model was also used to track the position and velocity of the car over time at
different desired positions. Figures 4-6 show the position and velocity in each of these circumstances.
(Note: the Simulink model does not compare to the real-time simulation.)
Figure 4 X and Y Graphs for a Goal of x=0 and y=0
Figure 5 Velocity Graph for a Goal of x=0 and y=0
Figure 6 X and Y Graphs for a Goal of x=5 and y=5
VI.2 Discussion of Data
The constructed PD controller manipulated motor speed based on a calculated error in both
position and velocity. As shown in the code¹, a distance of 50 cm was chosen as the target distance, and
the PD controller continuously calculates a position error defined as the target distance minus the actual
vehicle distance. The velocity error of the vehicle is defined as the position error divided by the time
between each ping measurement.
VI.3 Simulink Model
A closed negative-feedback loop was created in Simulink to model the control system of the car
(Fig. 4). In this model, the friction between the motor and the gears is taken into account, but the friction
between the wheels and the ground is neglected, making it an "ideal" system versus the experimental one.
The desired position of the car and the actual physical location of the car are the input and
output of the system, respectively, with the position difference as the error. A step-impulse signal was
used as the input, the controller was modeled as a PD controller, the motor was modeled by a
transfer function derived from Equations 3-12, and the gear-ratio constant and integrator of the system were
combined to transform the motor's output angular velocity into the output position of the car.
The ping sensor is then modeled as a constant in the feedback path of the system to indicate the error
difference from the measured output.
Using the PID tuner in Simulink, the PD controller was found to have the following results: the
proportional gain Kp was found to be 0.002 and the derivative gain Kd was found to
be 1.2. These gains resulted in a reaction time of 2.23e-6 seconds and a settling time
between 5 and 6 seconds under ideal conditions (Fig. 5).
Experimental
The first iteration of the control-system gains was based solely on the model made in Simulink:
a Kp of 0.002 and a Kd of 1.2. The model suggests using a very low proportional gain, or even
removing it completely. This was tested on our vehicle and resulted in a large settling time because the
vehicle would overshoot the target distance on each correction. The Simulink simulation models the
motor as a perfect system with no wind-up time and no changes to motor performance due to heat. The
sheer number of natural losses cannot be modeled in most computer systems; this is why many
gains have to be tuned after an initial model.
Tuning the system revealed that if the proportional gain was less than 1, the vehicle would not
start, because the proportional error is roughly zero and the velocity error very low, which causes
the motor to stop. Gains were tested near the Simulink values, and the settling time of the system was
measured. From these tests, a Kp gain of 1 and a Kd gain of 2 were found to have the lowest overshoot and
a settling time of 5-6 seconds, as predicted by the Simulink model. During each test the vehicle
consistently stopped at 55 cm, with small errors due to the ping sensor's sensitivity. The Arduino code with
the final proportional gains used to program the car is shown in Figure 2.
1 Figure 5 & 6
VII. Conclusions
In conclusion, the Arduino-coded RC car was successfully programmed to reach a target distance,
starting from any distance in front of or behind the target within the detectable range of the ping
sensor. The autonomously controlled car programmed with the PD controller is consistent with the
theoretical results simulated and modeled in Simulink. The Simulink model's proportional gain is
negligible, but when the model is exposed to real, practical environments, a small proportional gain
becomes necessary to reach the target distance within the same settling time as the
theoretical model. The theoretical and experimental results differed by only a small factor. The difference
in gains is caused by various factors that were not considered in the theoretical model but are present in
the experiment. The theoretical Simulink model does not account for the friction
between the wheels and the floor that the actual car experiences, nor for the
additional mass of the battery, platform, and circuit boards that the car must carry. Overall, the
programmed car behaved as expected and is comparable to the Simulink model. Listed below are trials
run with different Kd and Kp values at the same start distance and motor speed.
Technical difficulties were experienced while integrating the motion-control and object-avoidance
programming with the CMUcam4 color-tracking detection. The motors, ping sensor, and servo motor for
the ping sensor worked properly as a system, and the CMUcam4 functioned properly by itself, but when
the two systems were integrated, the object-avoidance system no longer functioned properly. There was
an issue with the power distribution to the board and sensors. When attempting to supply an independent
voltage source to the CMUcam4, a loss of function occurred; this was due to an improperly functioning
board, and as a result the team was issued a new camera. Using the new camera with the independent
power source and Arduino board, with a common ground to the rest of the car's system, resolved the
issue. Additionally, throughout the code-testing trials, the ping sensor suffered multiple collisions with
the wall and other objects, which increased the possibility of degraded ping-sensor performance and
possible errors in the distance readings. Lastly, there were difficulties calibrating the CMUcam4 to the
lighting environment inside the course area. While it tracked red objects in other areas, inside the court
it detected its own reflection off the white walls and sometimes tracked the red lines on the ground,
making it difficult to track the actual target.
The final code coupling the three degrees of freedom using all motors and sensors was successful
and functions as intended, meeting all requirements stated above. The car, after initializing for five
seconds, drives forward and avoids objects, and once the CMUcam4 tracks the color red within the
confidence level, it tracks the red-colored object. The proximity check, forward driving, and object
avoidance resume once the color is no longer detected. This process runs autonomously and
continues until power is no longer supplied.
Table 4: Trial Runs for Ping Sensor with CMUcam4

Values of Kp & Kd                           Settling time to reach steady state (s)
Kp = 0.5 & Kd = 1.5 (theoretical values)    The motor did not run
Kp = 1 & Kd = 2                             1.6
Kp = 2 & Kd = 4                             2.8
VIII. Appendix
X.1 List of Figures

Figure 7: CMUcam4
Figure 8: Parallax Ultrasonic Ping Sensor
Figure 9: Arduino Mega 2560
Figure 10: I2C Communication Arduino Schematic
Figure 11: Wiring and Power Supplies Configuration
Figure 12: Wiring Configuration of the Arduino Board and Motor Shield
Figure 13: State Transition Diagram
Figure 14: State-flow Chart in Simulink
Figure 15: MATLAB model of step response
Figure 16: PD Controller (Simulink model)
Figure 17: Simulink Auto-tune PD Control with Step Response
Figure 18: CMUcam4 Confidence Threshold
Figure 19: RGB Color Space
Figure 20: YUV Color Space
References

[1] Pittman Motors, Chapter 8.
T.O.M 3.0 (Final PRINT)

More Related Content

Similar to T.O.M 3.0 (Final PRINT)

Vehicle to Vehicle Communication using Bluetooth and GPS.
Vehicle to Vehicle Communication using Bluetooth and GPS.Vehicle to Vehicle Communication using Bluetooth and GPS.
Vehicle to Vehicle Communication using Bluetooth and GPS.
Mayur Wadekar
 
LC_Thesis_Final (1).pdf
LC_Thesis_Final (1).pdfLC_Thesis_Final (1).pdf
LC_Thesis_Final (1).pdf
Summrina Kanwal
 
Master_Thesis_Jiaqi_Liu
Master_Thesis_Jiaqi_LiuMaster_Thesis_Jiaqi_Liu
Master_Thesis_Jiaqi_LiuJiaqi Liu
 
Project report on Eye tracking interpretation system
Project report on Eye tracking interpretation systemProject report on Eye tracking interpretation system
Project report on Eye tracking interpretation system
kurkute1994
 
Smart Traffic Management System using Internet of Things (IoT)-btech-cse-04-0...
Smart Traffic Management System using Internet of Things (IoT)-btech-cse-04-0...Smart Traffic Management System using Internet of Things (IoT)-btech-cse-04-0...
Smart Traffic Management System using Internet of Things (IoT)-btech-cse-04-0...
TanuAgrawal27
 
Machine Learning Project - Neural Network
Machine Learning Project - Neural Network Machine Learning Project - Neural Network
Machine Learning Project - Neural Network
HamdaAnees
 
Innovative Payloads for Small Unmanned Aerial System-Based Person
Innovative Payloads for Small Unmanned Aerial System-Based PersonInnovative Payloads for Small Unmanned Aerial System-Based Person
Innovative Payloads for Small Unmanned Aerial System-Based PersonAustin Jensen
 
MACHINE LEARNING METHODS FOR THE
MACHINE LEARNING METHODS FOR THEMACHINE LEARNING METHODS FOR THE
MACHINE LEARNING METHODS FOR THEbutest
 
MACHINE LEARNING METHODS FOR THE
MACHINE LEARNING METHODS FOR THEMACHINE LEARNING METHODS FOR THE
MACHINE LEARNING METHODS FOR THEbutest
 
Project Report Distance measurement system
Project Report Distance measurement systemProject Report Distance measurement system
Project Report Distance measurement system
kurkute1994
 
TFG_Cristobal_Cuevas_Garcia_2018.pdf
TFG_Cristobal_Cuevas_Garcia_2018.pdfTFG_Cristobal_Cuevas_Garcia_2018.pdf
TFG_Cristobal_Cuevas_Garcia_2018.pdf
Gerard Labernia
 
(Ab)using Smart Cities - Whitepaper
(Ab)using Smart Cities - Whitepaper(Ab)using Smart Cities - Whitepaper
(Ab)using Smart Cities - Whitepaper
Opposing Force S.r.l.
 
AWS Pentesting
AWS PentestingAWS Pentesting
AWS Pentesting
MichaelRodriguesdosS1
 
Obstacle detection in images
Obstacle detection in imagesObstacle detection in images
Obstacle detection in imageshasangamethmal
 

Similar to T.O.M 3.0 (Final PRINT) (20)

Vehicle to Vehicle Communication using Bluetooth and GPS.
Vehicle to Vehicle Communication using Bluetooth and GPS.Vehicle to Vehicle Communication using Bluetooth and GPS.
Vehicle to Vehicle Communication using Bluetooth and GPS.
 
LC_Thesis_Final (1).pdf
LC_Thesis_Final (1).pdfLC_Thesis_Final (1).pdf
LC_Thesis_Final (1).pdf
 
Master_Thesis_Jiaqi_Liu
Master_Thesis_Jiaqi_LiuMaster_Thesis_Jiaqi_Liu
Master_Thesis_Jiaqi_Liu
 
Thesis
ThesisThesis
Thesis
 
Project report on Eye tracking interpretation system
Project report on Eye tracking interpretation systemProject report on Eye tracking interpretation system
Project report on Eye tracking interpretation system
 
978-3-659-82929-1
978-3-659-82929-1978-3-659-82929-1
978-3-659-82929-1
 
report
reportreport
report
 
Smart Traffic Management System using Internet of Things (IoT)-btech-cse-04-0...
Smart Traffic Management System using Internet of Things (IoT)-btech-cse-04-0...Smart Traffic Management System using Internet of Things (IoT)-btech-cse-04-0...
Smart Traffic Management System using Internet of Things (IoT)-btech-cse-04-0...
 
Machine Learning Project - Neural Network
Machine Learning Project - Neural Network Machine Learning Project - Neural Network
Machine Learning Project - Neural Network
 
Innovative Payloads for Small Unmanned Aerial System-Based Person
Innovative Payloads for Small Unmanned Aerial System-Based PersonInnovative Payloads for Small Unmanned Aerial System-Based Person
Innovative Payloads for Small Unmanned Aerial System-Based Person
 
MACHINE LEARNING METHODS FOR THE
MACHINE LEARNING METHODS FOR THEMACHINE LEARNING METHODS FOR THE
MACHINE LEARNING METHODS FOR THE
 
MACHINE LEARNING METHODS FOR THE
MACHINE LEARNING METHODS FOR THEMACHINE LEARNING METHODS FOR THE
MACHINE LEARNING METHODS FOR THE
 
Project Report Distance measurement system
Project Report Distance measurement systemProject Report Distance measurement system
Project Report Distance measurement system
 
TFG_Cristobal_Cuevas_Garcia_2018.pdf
TFG_Cristobal_Cuevas_Garcia_2018.pdfTFG_Cristobal_Cuevas_Garcia_2018.pdf
TFG_Cristobal_Cuevas_Garcia_2018.pdf
 
thesis
thesisthesis
thesis
 
thesis-2
thesis-2thesis-2
thesis-2
 
project(copy1)
project(copy1)project(copy1)
project(copy1)
 
(Ab)using Smart Cities - Whitepaper
(Ab)using Smart Cities - Whitepaper(Ab)using Smart Cities - Whitepaper
(Ab)using Smart Cities - Whitepaper
 
AWS Pentesting
AWS PentestingAWS Pentesting
AWS Pentesting
 
Obstacle detection in images
Obstacle detection in imagesObstacle detection in images
Obstacle detection in images
 

T.O.M 3.0 (Final PRINT)

  • 1. T.O.M 3.0 the Tom who’s finally going to catch Jerry! By: Team 1 Amit Bhakta Kayla Lovelady Michael Bruins For ME 4543.002 Mechatronics (Lab) University of Texas at San Antonio College of Engineering San Antonio, Texas Assignment Date April 5th, 2013 Assignment Due Date May 7th, 2013 Instructor Brent M. Nowak, Ph.D., Associate Professor Mechanical Engineering Department
  • 2. II. Executive Summary An autonomous vehicle (AV) is a vehicle that is able to operate independent of human interaction. It uses sensors to examine its environment, and the AV controlled through feedback from those sensors. The challenge that has been presented to our group is to design, build and test an autonomous vehicle, which will locate and track down a target (RC car) with red color. The project has two parts, tracking a non-moving red mouse and tracking a moving red mouse (in form of RC car). The 4 wheel AV will drive through a court and avoid obstacles as it attempts to catch the dead and living mouse scenarios. The goal for the first part of the project was to determine the safe distance for our ping sensor to operate for the AV to avoid objects and walls. The distance between objects is determined by the use of Paralax Ping Sensor, which acts as an ultrasonic distance sensor. The sensor emits and recives an ultrasoics signal which is defind by the pulse rate set by the user. That pulse rate then is convered to distance from the return signal that is recived by the sensor. From random stationary position, even if the bot is enclosed by 3 walls with small gap the bot should be able to find its way out by rotation the sensor and go into “Proximity check mode.” In this mode, the AV will be searching for a path to travel without obstructions and simultaneously scanning for objects with the color red to track. The second goal of the project was to track an object, the mouse (RC car), based on the color red with the CMUcam4. The CMUcam4 vision sensor will be used by the cat in order to locate the object with red color. It will accomplish this through the use of embeded camera module that will search for the the color that will be given certin MIN and MAX value of the color. The camera operates by identifying its frequency, which is unique to each color. The sensors will be connected to an Arduino Mega 2560, which is an open source platform that can receive input from varying types of sensors. Once the input has been received, the 2560 will interpret the data and assign a corresponding output to one or all four of the motors. The input/output dynamic is accomplished by an embedded microprocessor on the 2560. It will be governed by a user defined, group member, set of instructions, code, which will be written in a C based programming language. Proior to construction of the physical model, the AV was modled in MATLAB simulink as a feedback control system by ulitzing a mathematical model. Using this analysis helps us determine and describes its movement. Once the AV was constructed, the code was to be generated and compiled to meet specifications. The obstacle maneuvering code was first to be written and compiled. This was tested in the Mechatroics lab and fine tuned untill optimal performace was achieved. The code for ping with servo mount and CMUcam4 were tested individually then all codes were integrated and tested.
  • 3. Table of Contents III. Introduction ............................................................................................................................5 Subsystem 1:..................................................................................................................................6 Subsystem 2:..................................................................................................................................6 Subsystem 3:..................................................................................................................................6 Figure 1: DC motor attached to the gears .......................................................................................... 6 V.2 State Space.............................................................................................................................12 V.2 Mathematical Model for 2 and 4 WD.......................................................................................13 Discussion of Camera .......................................................................................................................16 V.3 Integration .............................................................................................................................17 Based on a global and current loop iteration time.............................................................................. 17 Ping Sensor..................................................................................................................................... 17 Modes and states for the motors ....................................................................................................... 18 Arduino to Arduino I 2 C communication............................................................................................ 19 V.5 Simulink Model.....................................................................................................................23 VI. Discussion of Data......................................................................................................................25 VI.1 Discussion of Results of the Simulink model...........................................................................25 VI.2 Discussion of Data.............................................................................................................29 VI.3 Simulink Model ....................................................................................................................29 Experimental................................................................................................................................29 VII. Conclusions ..........................................................................................................................30 Table 4: Trial Run For Ping sensor with CMUcam4........................................................................31 VIII. Appendix ..............................................................................................................................31 X.1 List of Figures ...................................................................................................................31 Figure 15 - MatLab model of step response....................................................................................... 36 Figure 16 PD Controller (Simulink model)........................................................................................ 
36 Figure 17 Simulink Auto-tune PD Control with Step Response............................................................ 37 Reference.........................................................................................................................................39 Reference [1] – Pittman Motors. Chapter 8.....................................................................................39
  • 4. Figure 1: DC motor attached to the gears.............................................................................................6 Figure 2 Simulink Kinematic Model..................................................................................................23 Figure 3 Model of Subsystem............................................................................................................24 Figure 4 X and Y Graphs for a Goal of x=0 and y=0 ...........................................................................26 Figure 5 Velocity Graph for a Goal of x=0 and y=0 ............................................................................27 Figure 6 X and Y Graphs for a Goal of x=5 and y=5 ...........................................................................28 Figure 7 CMUcam4 4 .......................................................................................................................31 Figure 9 Arduino Mega 2560.............................................................................................................32 Figure 8 Parallax Ultrasonic Ping Sensor............................................................................................31 Figure 10 I2C Communication Arduino Schematic .............................................................................32 Figure 11 Wiring and Power Supplies Configuration...........................................................................33 Figure 12 Wiring Configuration of the Arduino Board and Motor Shield..............................................33 Figure 13 State Transition Diagram ...................................................................................................35 Figure 14 State-flow Chart in Simulink ..............................................................................................35 Figure 15 - MatLab model of step response........................................................................................36 Figure 16 PD Controller (Simulink model) .........................................................................................36 Figure 17 Simulink Auto-tune PD Control with Step Response............................................................37 Figure 18 CMUcam4 Confidence Threshold.......................................................................................37 Figure 19 RGB color space ...............................................................................................................38 Figure 20 YUV ColorSpace...............................................................................................................38 Table 1: Motor Model & Specs: 130 Motor ..........................................................................................8 Table 2: requirements T.O.M 3.0.........................................................................................................9 Table 3:Power Budget.......................................................................................................................15 Table 4: Trial Run For Ping sensor with CMUcam4............................................................................31
  • 5. III. Introduction The control system implemented for this project was to control a coupled three degree-of- freedom (3DOF) motion system. For this project we have chosen to manipulate the motors’ speed in forward and backward directions on a mobile platform, a Parallax Ultrasonic Ping Sensor mounted on a servo to scan left and right, and a CMUcam4 to detect a color. The Parallax Ultrasonic Ping Sensor to measure distance to any object in its sight range. Using the position measured from the ping sensor, it controls the speed and direction of the motors in order to avoid objects. The servo motor that the ping sensor is attached to is also controlled by the ping sensor. Depending on the distance of an object, the servo motor is capable of staying centered and rotating 45o left and right from the centered position to scan the area around it. The CMUcam4 simultaneously detects the input color within a specified pixel density threshold and confidence. Based on the detection of the color, the motor platform will then transition into “tracking mode”, and follow the detected colored object. For the dead mouse, the AV will run into the non-moving RC car,and for the moving mouse the AV will follow it then finally ram into the mouse at a higher speed. PD controllers in a position and velocity based system are used to manipulate velocity and position error via the use of two independent gains, Kv and Kp (Eq 1 and Eq2), respectively. Simply put, the position error of the plant is the difference between the current position and the target position. The derivative error, or velocity error, is defined as the position error over the loop time. One thing to remember with these definitions is that if the vehicle over shoots the distance, there will be a negative error. With most motors this will cause problems because the system does not recognize a negative motor speed, just a rotational direction. To absolve these errors a series of if statements were implemented to change the motor direction based on the positive or negative state of the calculated error. As previously stated the motor velocity is the part of the plant that is to be manipulated. Simply leaving the motor speed constant would result in an un-damped system that would oscillate forward and backward over the target distance. The controller in this system adds an artificial dampening that will slow the motor as a function of the measured error in distance. To do this, a ratio of the total error, sum of the proportional and derivate error,and target distance changes the motor speed. If the error is measured as zero, the motor will receive and input of zero as its speed, effectively stopping the vehicle. Likewise if the error is larger, the vehicle will accelerate until it comes in range of an object. GreenLED will indicate that the target has beenacquired
  • 6. Subsystem 1: Input/Output - Position. These share a transfer function in form of gain, Kpot which is the ratio between Θ𝑖( 𝑠),input angles, to a voltage 𝑉𝑖( 𝑠). Where Θ𝑖( 𝑠) = 𝑛𝜋 and 𝑉𝑖( 𝑠) = 𝑉. 𝑉𝑖( 𝑠) Θ𝑖( 𝑠) = 𝐾p Equation 1-Position Gain Subsystem 2: Arduino constant gain K. This takes an input signal voltage 𝑉𝑒( 𝑠)and outputs a signal voltage 𝑉𝐴( 𝑠) for the DC motor to use. 𝑉𝐴 ( 𝑠) 𝑉𝑒( 𝑠) = 𝐾 Equation 2-Arduino Constant Gain Subsystem 3: DC Motor attached to the Gears and Load (Antenna). In order to analyze the complete subsystem, it’s necessary to do it from the inside out using KVL related to input voltage to the motor and from the motor to the position of the armature. Below is the schematic of the motor. Figure 1: DC motor attached to the gears KVL analysis resulted in: 𝑅 𝑎 𝐼𝑎(𝑠) + 𝐿 𝑎 𝑠𝐼𝑎(𝑠) + 𝑉𝑏(𝑠) = 𝑉𝐴 (𝑠) Equation 3-KVL
  • 7. The back-emf is proportional to the angular velocity of the motor’s rotor; Kb is the back-emf constant. 𝑉𝑏 = 𝐾𝑏 𝑠𝜃 𝑚 (𝑠) Equation 4-Motor Velocity The torque developed is a function of the motor-torque constant Kt and the armature current 𝐼 𝑎(𝑠): 𝑇 𝑚( 𝑠) = 𝐾𝑡 𝐼𝑎(𝑠) Equation 5-Motor Torque The effective inertia load (Jm) on the motor is: 𝐽 𝑚 = 𝐽𝑎 + 𝐽𝐿 ( 𝑁1 𝑁2 ) 2 Equation 6-Effective Inertia The effective damping (Dm) is: 𝐷 𝑚 = 𝐷 𝑎 + 𝐷 𝐿 ( 𝑁1 𝑁2 ) 2 Equation 7-Effective Damping Using Newton’s second law for rotation: 𝑇 𝑚( 𝑠) = 𝐽 𝑚 𝜃 𝑚 ( 𝑠) 𝑠2 + 𝐷 𝑚 𝜃( 𝑠) 𝑠 Equation 8 Angular displacement and input voltage 𝑉𝑎( 𝑠) is: 𝛩 𝑚(𝑠) 𝑉𝑎(𝑠) = 𝐾𝑡 𝑅 𝑎 𝐽 𝑚 [𝑠2 + 𝑠 { 1 𝐽 𝑚 (𝐷 𝑚 + 𝐾𝑡 𝐾𝑏 𝑅 𝑎 )}] Equation 9-Angular Displacement over input voltage The back-emf constant is: 𝐾𝑒 = 𝑉𝑎 𝜔 𝑛𝑜𝑙𝑜𝑎 𝑑 Equation 10-Back emf Constant The torque constant Kt is: 𝐾𝑡 = 𝑇 𝑚 𝐼𝑎 Equation 11-Torque Constant
  • 8. The gear ratio constant $R_1$ is:

$R_1 = \dfrac{N_1}{N_2}$   (Equation 12 – Gear Ratio Constant)

Given values using reference [1]:

$K_T$ = 0.019, $R_a$ = 0.5000, $J_m$ = 7.0600e-06, $D_m$ = 3.56e-06, $K_E$ = 0.019, $K_p$ = 1, $R_1$ = 5.0800 cm

Table 1: Motor Model & Specs
Model: 130
Motor size: 55 mm long x 48.3 mm wide x 23 mm high
Gear ratio: 1:120
No-load speed (3 V): 90 RPM
No-load speed (6 V): 180 RPM
No-load current (3 V): 120 mA
No-load current (6 V): 160 mA
Locked-rotor current (3 V): 1.5 A
Locked-rotor current (6 V): 2.8 A
Output mode: two-way shaft output
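As a rough sanity check (ours, not part of the original report), substituting the given values into Equation 9 at the motor shaft, before the 1:120 gear reduction, gives:

$\dfrac{\Theta_m(s)}{V_a(s)} \approx \dfrac{0.019/(0.5 \times 7.06\times10^{-6})}{s\left[s + \dfrac{3.56\times10^{-6} + (0.019)^2/0.5}{7.06\times10^{-6}}\right]} \approx \dfrac{5382}{s\,(s + 102.8)}$

The much slower pole of $-0.8828$ used in the state-space model of Section V.2 presumably reflects the gear reduction and the load inertia.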
  • 9. IV. Technical Problem

Introduction

The project chosen is an example of a three degree-of-freedom (3 DOF) system: a four-wheel-drive car programmed using an Arduino board. The car travels autonomously in the forward, backward, left, and right directions. More specifically, the car is programmed to track a colored RC car using the CMUcam4, at a distance measured by a ping sensor, as the object moves around the room. As the car traverses the course, it uses the ping sensor for object avoidance, and it stops when the target color is tracked. Once the color is no longer detected, the car resumes travel and object avoidance until the color is tracked again. User-defined parameters set the distance to maintain from an object and the color of the object to follow.

Requirements

Table 2: Requirements for T.O.M 3.0
Requirement | Description
3 Degrees of Freedom (coupled) | The direction and speed of each of the four independent motors, and the rotation of the ping sensor using a servo motor
Sensors | Parallax Ultrasonic Ping Sensor and CMUcam4 to measure distance and color
Set-off Tracking Distance | Programmed to avoid objects at a 50 cm distance
Track a Color | The CMUcam4 tracks the color red within a pixel and confidence threshold of 50
PID and Logic Control | The car's position, velocity, and tracking capabilities are programmed using a mixture of PID and logic control

Materials and Sensor Description

A mobile platform with 4 independent motors is used, shown in Figure 1. The CMUcam4 is a color-sensitive camera that detects a color within a certain confidence level, shown in Figure 7. The Parallax Ping Sensor is an ultrasonic range finder that continuously detects and calculates the distance of obstacles in the path of the vehicle via sound signals, shown in Figure 8. Two Arduino Mega 2560 boards are used for programming the motors and the ping sensor mounted on the vehicle, shown in Figures 9 and 10. A "Master and Slave" system connects the two boards, as discussed in the Integration section. The user-defined parameters are collected in a short configuration sketch below.
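For illustration only, the user-defined parameters above can be gathered into a handful of constants; the names and the exact RGB window are hypothetical placeholders, not the project's calibration values:

    // Illustrative constants for the user-defined parameters in Table 2.
    // The RGB window for "red" is an assumed placeholder calibration.
    const int SETOFF_CM   = 50;    // object-avoidance stand-off distance (cm)
    const int CONF_THRESH = 50;    // confidence threshold (0-255, about 19%)
    const int PIX_THRESH  = 50;    // pixel-density threshold (0-255)
    const int RED_MIN   = 120, RED_MAX   = 255;
    const int GREEN_MIN = 0,   GREEN_MAX = 60;
    const int BLUE_MIN  = 0,   BLUE_MAX  = 60;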
  • 10. A SainSmart L293D Motor Drive Shield acts as a dual full-bridge driver, allowing control over the speed and direction of each motor in the car, and it integrates with the programming of the ping sensor, the ping servo motor, and the CMUcam4.

Pixel and Confidence Threshold Description

The pixel value is the percentage of pixels tracked in the color-tracking window, ranging from 0 to 255, where 0 represents 0% and 255 represents 100%; 0 is returned if and only if no pixels at all are tracked within the color-tracking window. Confidence is the percentage of pixels tracked in the bounding box, on the same 0-to-255 scale; 0 is returned if and only if no pixels at all are tracked within the bounding box. In our case, a threshold of 50 corresponds to only about a 19% level of confidence for the tracking window.

B. Competing Constraints / Trade-offs

System constraints for T.O.M 3.0 are defined by the sensors used to control the vehicle, beginning with the sonar-based ping sensor. By implementing a single derivative component, the velocity error of the system can be controlled. Using this position sensor, the speed of each of the four wheels can be adjusted to slow or stop the motors as the vehicle approaches the desired position. Other constraints lie within the CMUcam4 because of its field of view: the camera is set up to read the x-position of an object between 0 and 160 pixels. If the object moves out of range, the CMUcam4 is free to lock onto any other object that meets its color requirements. The stopping conditions from the CMUcam4 and the ping sensor can also compete with each other: for example, if the red object is within ping distance but outside the range of the CMUcam4, the car starts searching for an exit because it believes it has reached a wall.

The CMUcam4 has the ability to sense the distance of an object in the z-direction; if programmed correctly, it could simultaneously keep track of a color and the position of the vehicle. The problem is that the colors of the objects to avoid would then also have to be specified. Likewise, as previously mentioned, the use of two sensors can cause them to compete with each other in the code. A large trade-off of using two sensors is the power draw of each. The CMUcam4 draws considerable current and has a high voltage requirement. If the CMUcam4 and the ping sensor run on the same source, they experience a large delay when both are active because of the power draw. This requires each sensor to have its own power source, which can lead to weight issues.
  • 11. TOM 3.0

V. Design Approach: The Malicious Mouse who stole ma' cheese

Jerry's escape pod
  • 12. V.2 State Space

Using the equations for state space, the motor-output state-space equation, with the control variables taken into account, is:

$\dot{\mathbf{x}} = \begin{bmatrix} 0 & 1 \\ 0 & -0.8828 \end{bmatrix}\mathbf{x} + \begin{bmatrix} 0 \\ 1 \end{bmatrix}(5\,\mathrm{V}), \qquad y = \begin{bmatrix} 1 & 0 \end{bmatrix}\mathbf{x}$   (Equation 13 – State Space Equation)

where the control variables are the output angle and output angular velocity, respectively:

$\mathbf{x} = \begin{bmatrix} \theta_o(t) \\ \dfrac{d\theta_o(t)}{dt} \end{bmatrix}$   (Equation 14 – Control Variable Vector)

$\dot{\mathbf{x}} = \begin{bmatrix} \dfrac{d\theta_o(t)}{dt} \\ \dfrac{d^2\theta_o(t)}{dt^2} \end{bmatrix}$   (Equation 15 – Control Variable Vector Derivative)

Hence, this system of equations can describe the system's motion at any point in time and for any input given to it. Figure 3 displays MATLAB code that can be used to solve for the state-space response with this set of equations, along with the calculated response. While our code and output were unsuccessful, the use of the ss function and the step(ss) command is the general approach to computationally solving state-space problems in a matrix-based program.
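Since the MATLAB attempt did not produce usable output, the step response of Equation 13 can also be checked with a simple forward-Euler integration. The following C++ sketch (ours, with an arbitrary 5-second horizon) illustrates the idea:

    #include <cstdio>

    // Forward-Euler integration of Equation 13: x' = Ax + Bu, y = Cx, u = 5 V.
    // A sanity-check sketch only; MATLAB's ss() and step() do this properly.
    int main() {
        double theta = 0.0, omega = 0.0;   // theta_o and d(theta_o)/dt
        const double u = 5.0;              // 5 V step input
        const double dt = 0.001;           // integration step (s)
        for (double t = 0.0; t < 5.0; t += dt) {
            double dTheta = omega;                 // row 1 of A: [0 1]
            double dOmega = -0.8828 * omega + u;   // row 2: [0 -0.8828], B = [0;1]
            theta += dTheta * dt;
            omega += dOmega * dt;
        }
        // omega settles toward u / 0.8828 = 5.66 rad/s as t grows.
        std::printf("theta_o = %.3f rad, omega = %.3f rad/s at t = 5 s\n",
                    theta, omega);
        return 0;
    }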
  • 13. V.2 Mathematical Model for 2 and 4 WD

Assuming a two-wheel bicycle kinematic model with no slip of the wheels, meaning there is no y-velocity component:

$\dot{\theta} = \dfrac{v}{R_1}, \qquad R_1 = \dfrac{L}{\tan\alpha}, \qquad R_2 > R_1, \qquad \gamma_L = \gamma_R$

For the world frame:

$\dot{x} = v\cos\theta, \qquad \dot{y} = v\sin\theta, \qquad \dot{\theta} = \left(\dfrac{v}{L}\right)\tan\gamma$

The no-slip constraint is $\dot{y}\cos\theta - \dot{x}\sin\theta = 0$. This relationship cannot be integrated to form a direct relationship between x, y, and $\theta$ (it is non-holonomic); therefore, Simulink was used to model the system. While moving to a goal point $(x^*, y^*)$, the velocity is controlled proportionally to the distance from the goal:
  • 14. $v^* = k_v\sqrt{(x^* - x)^2 + (y^* - y)^2}, \qquad \theta^* = \tan^{-1}\left(\dfrac{y^* - y}{x^* - x}\right)$

Using a proportional controller for the heading:

$\gamma = k_h(\theta^* - \theta), \qquad k_h > 0$

For a four-wheel drive, the turning radius is just the length of the car. This, in effect, is due to the individual control of all four wheels (4WD). Applying the method shown below to the turning of each wheel gives the bot a turning radius equal to the length of the car. A sketch of the goal-seeking controller follows.
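Here is one way to simulate the goal-seeking controller with the bicycle kinematics above (a sketch, not the project's model). The wheelbase, steering clamp, and iteration count are our assumptions, while kv and kh follow the Simulink gains quoted in Section VI.1:

    #include <cmath>
    #include <cstdio>

    // Goal-seeking bicycle-model simulation: v* = kv * distance-to-goal,
    // theta* = heading to the goal, gamma = kh * heading error.
    int main() {
        double x = 0, y = 0, theta = 0;            // pose in the world frame
        const double xg = 5, yg = 5;               // goal point (x*, y*)
        const double kv = 0.1, kh = 15;            // gains (Section VI.1)
        const double L = 0.25;                     // assumed wheelbase (m)
        const double dt = 0.01;
        for (int i = 0; i < 10000; ++i) {          // 100 s of simulated time
            double v = kv * std::hypot(xg - x, yg - y);
            double thetaStar = std::atan2(yg - y, xg - x);
            double err = std::atan2(std::sin(thetaStar - theta),
                                    std::cos(thetaStar - theta)); // wrap angle
            double gamma = kh * err;
            if (gamma > 0.35)  gamma = 0.35;       // clamp steering to ~20 deg,
            if (gamma < -0.35) gamma = -0.35;      // the limit used in VI.1
            x += v * std::cos(theta) * dt;         // world-frame kinematics
            y += v * std::sin(theta) * dt;
            theta += (v / L) * std::tan(gamma) * dt;
        }
        std::printf("final pose: x = %.2f, y = %.2f\n", x, y); // nears (5, 5)
        return 0;
    }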
  • 15. Table 3: Power Budget
Component | Volts | mW | mA | #
Ping Sensor | 5 | 150 | 30 | 1
Servo | 5 | 700 | 140 | 1
CMUcam4 | 9 | 215 | 120 | 1
4 Motors (no-load current) | 3.4 | 21.9 | 120 | 4
Arduino (Master) | 9 | 2000 | 400 | 1
Arduino (Slave) | 5 | 2000 | 400 | 1
Total | 36 | 3086.9 | 810 | 9

Battery capacity (mAh): 1400
Estimated runtime (h): 0.68
Estimated runtime (min): 34
  • 16. Discussion of Camera

For this project a camera sensor, the CMUcam4, was programmed through the "Master" Arduino microcontroller to track a particular color. The color space chosen was RGB, and minimum and maximum values were specified for each channel to track the color red. The RGB color space parameterizes the ranges of the primary colors red, green, and blue (Figure 19); in other words, all colors are defined by these three primaries in different amounts and ratios. Additional confidence and pixel tolerances were given to further indicate when the camera is tracking the colored object in the viewing window. The confidence level reflects the number of pixels in the tracking area (pixel density). Additionally, a filter for noise was added.

When the car is powered, the CMUcam4 starts its initialization protocol by sensing and adjusting to the level of light in its surrounding environment using auto-gain and auto white-balance. Once the specified auto-adjustments are complete, the CMUcam4 begins receiving and transmitting the tracking data to the Master Arduino through the TXO and RXI pins, which in turn transmits the collected color-tracking data through the SCL and SDA pins (clock and data pins) to the receiving "Slave" Arduino. The car is programmed to stop the motors when the CMUcam4 detects an object of the tracking color within the confidence tolerance, and to resume the driving states when an object is no longer detected. As confirmation of proper CMUcam4 tracking, an LED was attached to the Arduino; based on the same color-tracking criteria, the LED turns on while the color is being tracked.
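A minimal sketch of this start-up and confirmation sequence, written against the CMUcam4 Arduino library's documented interface; the RGB window, timing, and LED pin are assumptions rather than the project's calibration:

    #include <CMUcam4.h>
    #include <CMUcom4.h>

    // Sketch of the tracking setup described above; the RGB window and
    // LED pin are assumed values, not the project's actual calibration.
    CMUcam4 cam(CMUCOM4_SERIAL);
    const int LED_PIN = 13;

    void setup() {
      pinMode(LED_PIN, OUTPUT);
      cam.begin();
      cam.autoGainControl(true);       // let the camera adapt to ambient light
      cam.autoWhiteBalance(true);
      delay(5000);                     // five-second initialization period
      cam.autoGainControl(false);      // then freeze the adjustments
      cam.autoWhiteBalance(false);
      cam.trackColor(120, 255, 0, 60, 0, 60); // assumed min/max RGB for red
    }

    void loop() {
      CMUcam4_tracking_data_t data;
      cam.getTypeTDataPacket(&data);   // poll one type-T tracking packet
      // Tracking is confirmed only above the pixel and confidence thresholds.
      if (data.confidence > 50 && data.pixels > 50) {
        digitalWrite(LED_PIN, HIGH);   // target acquired: light the LED
      } else {
        digitalWrite(LED_PIN, LOW);
      }
    }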
  • 17. V.3 Integration

Timing based on global and per-iteration loop times

The entire program is timed using variables that track an overall global time, via the built-in Arduino function millis(), together with a previous global time for each loop iteration. The global time is reset after each loop.

Ping Sensor

The ping sensor is programmed around a specified "set-off distance" of 50 cm from an object (obstacle); if an object is sensed, the sensor performs a proximity scan to determine the new path direction. This is accomplished by mounting the ping sensor on a servo motor that swivels it through a 90° range, a 45° angular displacement left and right. A portion of the Proximity Check function created is shown below.
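That listing was supplied as a figure; as a stand-in, here is a minimal sketch of how such a routine might look, using the standard PING))) trigger/echo timing (the pin numbers, delays, and which servo angle counts as "left" are assumptions):

    #include <Servo.h>

    // Sketch of a proximity-check routine like the one described above.
    const int PING_PIN = 7;
    Servo pingServo;                       // servo that swivels the ping sensor

    long pingCm() {
      pinMode(PING_PIN, OUTPUT);           // PING))) uses one pin for both
      digitalWrite(PING_PIN, LOW);  delayMicroseconds(2);
      digitalWrite(PING_PIN, HIGH); delayMicroseconds(5);   // trigger pulse
      digitalWrite(PING_PIN, LOW);
      pinMode(PING_PIN, INPUT);
      long us = pulseIn(PING_PIN, HIGH);   // echo round-trip time
      return us / 29 / 2;                  // ~29 us per cm, out and back
    }

    // Swivel 45 degrees either side of center and report the clearer side.
    int proximityCheck() {
      pingServo.write(45);  delay(300);    // look to one side of center
      long leftCm = pingCm();
      pingServo.write(135); delay(600);    // look to the other side
      long rightCm = pingCm();
      pingServo.write(90);  delay(300);    // re-center the sensor
      return (leftCm > rightCm) ? 3 : 2;   // state 3 = turn left, 2 = right
    }

    void setup() { pingServo.attach(9); pingServo.write(90); }
    void loop()  { if (pingCm() < 50) proximityCheck(); }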
  • 18. Modes and states for the motors

The car's motion is controlled by four independent motors, one for each wheel. The speed and direction of the motors are based on the current mode:

Mode Number | Action
1 | Drive
2 | Proximity Check
3 | Turning

Within each mode the driving direction is specified by a state. The modes and states of the motors are if-else cases that depend on the distance detected by the ping sensor and the set-off distance:

State | Driving Direction
0 | Stop
1 | Straight
2 | Right
3 | Left

In Mode 1, the motors continue to drive forward if the ping sensor detects an object farther than the set-off distance, switching to state 1 (if not already there) in order to drive straight ahead. If the distance sensed by the ping sensor is less than the set-off distance, the motors change to state 0 (if not already in state 0), which stops the car, and the program switches to Mode 2. A condensed sketch of this mode/state logic is shown below.
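The sketch below condenses the tables above into a switch statement; pingCm() and proximityCheck() reuse the hypothetical helpers from the previous sketch, and the motor helpers are likewise stand-ins for the project's functions:

    // Condensed mode/state machine from the tables above.
    long pingCm();                     // from the previous sketch
    int  proximityCheck();             // from the previous sketch
    void driveStraight(); void stopMotors();   // motor-shield stand-ins
    void turnRight();     void turnLeft();

    int mode = 1;                      // 1 = Drive, 2 = Proximity Check, 3 = Turning
    int state = 1;                     // 0 = Stop, 1 = Straight, 2 = Right, 3 = Left
    const int SETOFF_CM = 50;
    const unsigned long TURN_MS = 750; // turn length time (Mode 3)
    unsigned long turnStart = 0;

    void loop() {
      switch (mode) {
        case 1:                                   // Drive
          if (pingCm() > SETOFF_CM) {
            if (state != 1) state = 1;            // drive straight forward
            driveStraight();
          } else {
            if (state != 0) state = 0;            // stop the car
            stopMotors();
            mode = 2;                             // go scan for a way out
          }
          break;
        case 2:                                   // Proximity Check
          state = proximityCheck();               // 2 = turn right, 3 = turn left
          turnStart = millis();
          mode = 3;
          break;
        case 3:                                   // Turning
          (state == 2) ? turnRight() : turnLeft();
          if (millis() - turnStart >= TURN_MS) {  // turned long enough
            state = 1;
            mode = 1;                             // resume driving
          }
          break;
      }
    }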
  • 19. Mode 2 is the proximity check: after scanning, depending on the left and right distances detected by the ping sensor, the program changes to Mode 3 and sets the motor state to 2 or 3 in order to avoid the obstacle. Mode 3 then turns the car until the turning time equals the specified turn length of 750 ms.

Arduino-to-Arduino I2C communication

As previously mentioned, two Arduino boards were used in a "Master and Slave" system. This system uses I2C communication and the Wire library to transmit data between the boards, with the Master board writing the data and the Slave board receiving it. The communication requires connecting the SDA and SCL (data and clock) pins, 20 and 21 respectively (Figure 10), between the boards, as well as sharing common power and ground lines. For this project, the CMUcam4 was connected to the Master board; the color-tracking data was processed on this board and sent to the Slave. The Slave Arduino received the data and integrated it with all four motors, the ping sensor, and the servo motor that rotates the ping sensor.

In order to track the mouse, the car must be able to follow and turn with it. This is again determined by integrating the ping sensor and the CMUcam4. As mentioned previously, the ping sensor and motors are integrated to control the velocity of the motors for object avoidance. In addition, depending on the established RGB min/max ranges and confidence level, the CMUcam4 signals the start of the car's tracking mode if both the distance and color criteria are met. To follow and turn with the mouse, each motor's speed is adjusted based on the colored object's centroid data from the CMUcam4.
  • 20. The left side of the code is the camera-tracking data loop on the Master Arduino board. Communication with the Slave Arduino board is opened and closed by the commands Wire.beginTransmission(2) and Wire.endTransmission(); the number 2 is the "address" of the Slave board. The data from the camera is transmitted to the Slave using the command Wire.write(). In this code the tracking data listed in the table below is transmitted to the Slave Arduino board:

Variable | Tracking Data Parameter
a | X centroid data (mx)
b | Y centroid data (my)
c | Confidence
p | Pixel density

The right side of the code shows parts of the Slave Arduino board's code. The top portion establishes communication with the Master board, and below it is the turning function based on the data received from the camera on the Master board via the command Wire.read(). A condensed sketch of this exchange follows.
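The actual listings were supplied as figures; the sketch below condenses the exchange using the same Wire calls named above. The two fragments run on separate boards, and the turning thresholds and motor helpers are assumptions, not the project's tuned values:

    #include <Wire.h>

    void turnLeft(); void turnRight(); void driveStraight(); // stand-ins

    // --- Master board (CMUcam4 side): calls Wire.begin() with no address ---
    // and sends one tracking packet (a, b, c, p from the table) to the Slave.
    void sendTrackData(byte a, byte b, byte c, byte p) {
      Wire.beginTransmission(2);     // 2 = the Slave board's address
      Wire.write(a);                 // X centroid (mx)
      Wire.write(b);                 // Y centroid (my)
      Wire.write(c);                 // confidence
      Wire.write(p);                 // pixel density
      Wire.endTransmission();
    }

    // --- Slave board (motor side): receives and steers on the X centroid ---
    volatile byte mx, my, conf, pix;

    void receiveEvent(int count) {
      if (count >= 4) {
        mx   = Wire.read();
        my   = Wire.read();
        conf = Wire.read();
        pix  = Wire.read();
      }
    }

    void setup() {
      Wire.begin(2);                 // join the I2C bus as Slave address 2
      Wire.onReceive(receiveEvent);  // register the receive handler
    }

    void loop() {
      if (conf > 50 && pix > 50) {   // color criteria met: tracking mode
        // The CMUcam4 X centroid spans 0-160 pixels; the thresholds
        // below are assumed for illustration.
        if (mx < 60)       turnLeft();
        else if (mx > 100) turnRight();
        else               driveStraight();
      }
    }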
  • 21. V.4 General Assembly

The general assembly of the circuit is shown below.

Figure 1: Wiring diagram of the complete system
  • 23. V.5 Simulink Model

Figure 2: Simulink kinematic model
  • 24. Figure 3: Model of subsystem
  • 25. VI. Discussion of Data

VI.1 Discussion of Results of the Simulink Model

The gains for the Simulink model in Figure 2 were set to Kv = 0.1, Kb = 0.2, and Kh = 15. For Figure 3, the velocity limit is set to 6 mph, the acceleration limit to 3 ft/s², and the steering-angle limit to 20 degrees. To test whether the Simulink model was functioning properly, the desired x- and y-positions were both set to 0, so the car should not move. This result is confirmed in Figure 4; the velocity also remains at 0, which Figure 4 confirms as well. In Figure 6, the desired x-position and y-position are both set to 5. As expected, x approaches and remains at a value of 5, as does y. Figure 6 shows that the velocity decreases as the car approaches its target position of x = 5 and y = 5. The Simulink model was also used to track the position and velocity of the car over time for different desired positions; Figures 4-7 show the position and velocity in each of these cases. The Simulink model does not fully match the real-time behavior of the physical car.
  • 26. Figure 4: X and Y graphs for a goal of x = 0 and y = 0
  • 27. Figure 5: Velocity graph for a goal of x = 0 and y = 0
  • 28. Figure 6: X and Y graphs for a goal of x = 5 and y = 5
  • 29. VI.2 Discussion of Data

The constructed PD controller manipulated motor speed based on a calculated error in both position and velocity. As shown in the code¹, a distance of 50 cm was chosen as the target distance, and the PD controller continuously calculates a position error defined as the target distance minus the actual vehicle distance. The velocity error of the vehicle is defined as the position error divided by the time between ping measurements.

VI.3 Simulink Model

A closed negative-feedback loop was created in Simulink to model the control system of the car (Fig. 4). In this model the friction between the motor and the gears is taken into account, but the friction between the wheels and the ground is neglected, making it an "ideal" system versus the experimental one. The desired position of the car and the actual physical location of the car are the input and output of the system, respectively, with the position difference as the error. A step-impulse signal was used as the input, the controller was modeled as a PD controller, the motor was modeled by a transfer function derived from Equations 3-12, and the gear-ratio constant and integrator of the system were combined to transform the motor's output angular velocity into the output position of the car. The ping sensor is modeled as a constant in the feedback path to indicate the error difference from the measured output. Using the PID tuner in Simulink, the PD controller was found to have the following results: the proportional gain Kp was 0.002 and the derivative gain Kd was 1.2. These gains gave a reaction time of 2.23e-6 seconds and a settling time between 5 and 6 seconds under ideal conditions (Fig. 5).

Experimental

The first iteration of the control-system gains was based solely on the Simulink model: a Kp of 0.002 and a Kd of 1.2. The model suggests using a very low proportional gain, or even removing it completely. This was tested on our vehicle and resulted in a large settling time because the vehicle would overshoot the target distance on each correction. The Simulink simulation models the motor as a perfect system with no wind-up time and no changes to motor performance due to heat. The sheer number of real-world losses cannot be modeled in most computer simulations, which is why many gains have to be tuned after an initial model. Tuning revealed that if the proportional gain was less than 1, the vehicle would not start, because the proportional error is roughly zero and the velocity error would be very low, causing the motor to stop. Gains were tested near the Simulink values, and the settling time of the system was measured. From these tests, a Kp of 1 and a Kd of 2 were found to give the lowest overshoot and a settling time of 5-6 seconds, as predicted by the Simulink model. During each test the vehicle consistently stopped at 55 cm, with small errors due to the ping-sensor sensitivity. The Arduino code with the final proportional gains used to program the car is shown in Figure 2.

¹ Figures 5 & 6
  • 30. VII. Conclusions

In conclusion, the Arduino-coded RC car was successfully programmed to reach a target distance starting from any distance in front of or behind the target within the detectable range of the ping sensor. The autonomously controlled car programmed with the PD controller is consistent with the theoretical results simulated and modeled in Simulink. The Simulink model's proportional gain is negligible, but when the model is exposed to real, practical environments, a small proportional gain becomes necessary to reach the target distance within the same settling time as the theoretical model. The theoretical and experimental results differed only by a small factor. The difference in gains is caused by factors that are present in the experiment but were not considered in the theoretical model: the Simulink model does not account for the friction between the wheels and the floor that the actual car experiences, nor for the additional mass of the battery, platform, and circuit boards that the car has to carry. Overall, the programmed car behaved as expected and is comparable to the Simulink model. Listed below are trials run with different Kd and Kp values at the same start distance and motor speed.

Technical difficulties were experienced while integrating the motion-control and object-avoidance programming with the CMUcam4 color-tracking detection. The motors, ping sensor, and servo motor for the ping sensor worked properly as a system, and the CMUcam4 functioned properly by itself, but when the two systems were integrated, the object-avoidance system no longer functioned properly. There was an issue with the power distribution to the board and sensors: when an independent voltage source was supplied to the CMUcam4, a loss of function occurred. This was traced to an improperly functioning board, and a new camera was issued as a result. Using the new camera with an independent power source and an Arduino board sharing a common ground with the rest of the car's system resolved the issue. Additionally, throughout the code-testing trials, the ping sensor suffered multiple collisions with the wall and other objects, which increased the possibility of degraded ping-sensor performance and possible error in the distance readings. Lastly, there were difficulties calibrating the CMUcam4 to the lighting environment inside the course area: while it tracked red objects in other areas, inside the court it detected its own reflection off the white walls and sometimes tracked the red lines on the ground, making it difficult to track the actual target.

The final code coupling the three degrees of freedom using all motors and sensors was successful and functions as intended, meeting all the requirements stated above. The car, after initializing for five seconds, drives forward and avoids objects, and once the CMUcam4 tracks the color red within the confidence level, it tracks the red-colored object. The proximity check, forward driving, and object avoidance resume once the color is no longer detected. This process runs autonomously and continues until power is no longer supplied.
  • 31. Table 4: Trial runs for the ping sensor with the CMUcam4

Different values of Kp & Kd | Settling time to reach steady state (seconds)
Kp = 0.5 & Kd = 1.5 (theoretical values) | The motor did not run
Kp = 1 & Kd = 2 | 1.6
Kp = 2 & Kd = 4 | 2.8

VIII. Appendix

VIII.1 List of Figures

Figure 7: CMUcam4
Figure 8: Parallax Ultrasonic Ping Sensor
  • 32. Figure 9: Arduino Mega 2560
Figure 10: I2C communication Arduino schematic
  • 33. Figure 11: Wiring and power supplies configuration
Figure 12: Wiring configuration of the Arduino board and motor shield
  • 35. Figure 13: State transition diagram
Figure 14: State-flow chart in Simulink
  • 36. Figure 15: MATLAB model of step response
Figure 16: PD controller (Simulink model)
  • 37. Figure 17: Simulink auto-tune PD control with step response
Figure 18: CMUcam4 confidence threshold
  • 38. Figure 19: RGB color space
Figure 20: YUV color space
  • 39. References

[1] Pittman Motors, Chapter 8.