Design and Development of a ROS Enabled
All-Terrain Vehicle Platform
Benjamin B. Rhoades, Disha Srivastava, James M. Conrad
William Lee College of Engineering
University of North Carolina at Charlotte
Charlotte, North Carolina 28223
Email: brhoade1@uncc.edu, dsrivas2@uncc.edu, jmconrad@uncc.edu
Abstract—Writing software for robots is difficult, particularly
as the scale and scope of robotics continues to grow. Different
types of robots can have wildly varying hardware, making code
reuse nontrivial. On top of this, the sheer size of the required
code can be daunting, as it must contain a deep stack starting
from driver-level software and continuing up through abstract
reasoning, and beyond. To meet these challenges, the Robot Operating System (ROS) framework was developed [1].
Using the ROS framework, this paper presents research
on designing and developing a ROS enabled ATV with an
Ackermann steering frame. The main goal of this project was to
allow an All-Terrain Vehicle (ATV) to either be controlled using
ROS or using the existing remote control mechanism developed
by previous work. A small scale system that utilized a DaNI 2.0
robotic platform was used to verify the base functionality needed
to operate as a base ROS enabled robot. In order to replicate the small-scale system on the ATV, a novel Quadrature Rotary Encoder
(QRE) was designed and developed. Additional resolution was
added to the previous braking control system to accommodate
the high fidelity that ROS requires to control the ATV.
Index Terms—Robot Operating System (ROS), Quadrature
Rotary Encoder, Controller Area Network (CAN), All-Terrain
Vehicle (ATV)
I. INTRODUCTION
The University of North Carolina at Charlotte (UNCC) is currently working on the research and development (R&D)
of an autonomous ATV. Multiple teams have worked on this
project since it started. Figure 1 shows an image of this ATV.
Fig. 1: Previous Setup of RC Controlled ATV[2]
1) Previous Work: Previous work had enabled the ATV to
be controlled via a remote control system. To fully mimic a full-size vehicle, the Controller Area Network (CAN) bus was simulated via CAN transceivers on the ATV. Multiple microcontrollers were used to distribute and parse the control messages for the ATV in its current design. Previous teams
have enabled the ATV to be fully controlled via a servo
controlled throttle, a braking actuator, and a steering assist
motor. These components enabled the ATV to be Radio
Controlled (RC).
2) System Overview: The system block diagram in Figure 2
illustrates the high level connection of the ATV. The National
Instruments (NI) myRIO acts as the primary controller, which subscribes to the Twist commands published by the ROS master node and publishes Odometry information back to it.
Fig. 2: High Level Block Diagram of Current ATV
The Rx63N microcontroller translates these command messages into the appropriate CAN bus message format. These messages travel via the CAN high (CAN_H) and CAN low (CAN_L) wires to their respective CAN transceivers, which then pass them to the Steering, Brake, and Throttle Modules.
II. BACKGROUND
A. Robot Operating System (ROS)
The Robot Operating System (ROS) is a flexible framework
for writing robot software. It is a collection of tools, libraries
and conventions that aim to simplify the task of creating com-
plex and robust robot behavior across a wide variety of robotic
platforms.[3] ROS can thus be described as an open-source, meta-operating system for a robot. It provides the services one would expect from an operating system, including hardware abstraction, low-level device control, implementation of commonly used functionality, message passing between processes, and package management.[4] The fundamental concepts of the ROS implementation are nodes, messages, topics, and services. Nodes are processes that perform computation. ROS is designed to be modular at a fine-grained scale: a system is typically composed of many nodes. These nodes communicate with each other by passing messages. A message is a strictly typed data structure. It supports standard primitive data types (integer, floating point, Boolean, etc.) as well as arrays of primitive types and constants. A node sends a message by
publishing it to a given topic, which can be a simple string
such as odometry or twist. A node that is interested in a certain
kind of data will subscribe to the appropriate topic. There
can be multiple concurrent publishers and subscribers for a
single topic, and a single node may publish and/or subscribe
to multiple topics. In general, publishers and subscribers are
not aware of each other. A service is defined by a string name
and a pair of strictly typed messages: one for the request and
one for the response.[1]
Fig. 3: Communication between ROS Master and ROS Node
ROS starts with the ROS Master. The Master allows all
other ROS Nodes to find and talk to each other as shown in
Figure 3.
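To make the idea of a strictly typed message concrete, the geometry_msgs/Twist message used later in this work carries two three-component vectors of 64-bit floats. A C struct mirroring that standard layout is sketched below purely for illustration; it is not code from this project.

    /* Illustrative C mirror of the ROS geometry_msgs/Twist message.
     * In ROS the message is defined in a .msg file; the fields below
     * follow the standard definition (two 3-D vectors of float64). */
    typedef struct {
        double x;
        double y;
        double z;
    } Vector3;

    typedef struct {
        Vector3 linear;   /* commanded linear velocity  (m/s)   */
        Vector3 angular;  /* commanded angular velocity (rad/s) */
    } Twist;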
B. Controller Area Network (CAN)
Fig. 4: Standard CAN Message Packet Structure[5]
Controller Area Network (CAN) is an asynchronous serial communication protocol which follows the ISO 11898 standard and is widely accepted in automobiles due to its real-time performance, reliability, and compatibility with a wide range of devices.[6] It was developed by Bosch to enable communication between the various modules present in a vehicle. CAN is a two-wire differential bus with data rates of up to 1 Megabit per second (Mbps) and offers a very high level of security. The bus is half-duplex, with the two differential wires, CAN_L and CAN_H, used by the nodes to transmit data. The bus uses two logic levels, called dominant and recessive, where the dominant level corresponds to TTL = 0 V and the recessive level to TTL = 5 V. The dominant level always overrides the recessive level, and this property is used to implement bus arbitration.[6]
In the CAN protocol, nodes communicate data or information through messages termed frames. A frame is transmitted onto the bus only when the bus is in the idle state. There are four different types of frames used for communication over the CAN bus.[5]
1) Data Frame - Used to send data
2) Remote Frame - Used to request data
3) Error Frame - Used to report an error condition
4) Overload Frame - Used to request a delay between two
data or remote frames
The architecture of the Data and Remote frames is exactly the same. A Data frame has higher priority than a Remote frame. Each Data and Remote frame starts with a Start Of Frame (SOF) field and ends with an End Of Frame (EOF) field. Figure 5 gives the architecture of the Data and Remote frames.
Fig. 5: Architecture of Data Frame of CAN Bus[6]
The following are the main fields in the Data and Remote frames:
1) Message Identifier (11/29 bits) - This field contains a message ID for each frame, which is either 11 (standard ID) or 29 (extended ID) bits long. No two message frames in the CAN network should have the same message ID.
2) Data Field - This field contains the actual data, which can be a maximum of 8 bytes.
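As a rough illustration of the application-visible content of such a frame (a sketch, not the project's code), the identifier, length, and payload can be represented as:

    #include <stdint.h>

    /* Simplified view of the application-visible part of a CAN data frame:
     * an 11- or 29-bit identifier, a data length code, and up to 8 bytes
     * of payload. Bus-level fields (SOF, CRC, ACK, EOF) are handled by
     * the CAN controller hardware. */
    typedef struct {
        uint32_t id;        /* 11-bit standard or 29-bit extended ID */
        uint8_t  extended;  /* nonzero if the 29-bit format is used  */
        uint8_t  dlc;       /* data length code: 0 to 8 bytes        */
        uint8_t  data[8];   /* payload                               */
    } can_frame_t;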
The CAN protocol uses mailboxes for transmission and
reception of data. A transmission mailbox carries the data to
be transmitted onto the bus and the reception mailbox stores
the received data.
The CAN protocol allows priority to be assigned based on ID numbers: lower ID numbers are given higher priority during transmission. Each message type is assigned an ID, and each mailbox is set up to receive a specific message ID. The code snippet in Figure 6 shows that BRAKE is given the highest priority, compared to THROTTLE, BRAIN, and STEER.
Fig. 6: Code snippet : CAN Bus priority
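The actual identifier values used on the ATV are not reproduced here; the sketch below only illustrates the prioritization idea, with hypothetical 11-bit IDs. Only BRAKE's position as the highest-priority message is taken from the text; the relative order of the other IDs is arbitrary.

    /* Hypothetical 11-bit CAN message IDs: a lower ID means a higher
     * priority, so BRAKE always wins bus arbitration. */
    #define CAN_ID_BRAKE     0x010
    #define CAN_ID_THROTTLE  0x020
    #define CAN_ID_BRAIN     0x030
    #define CAN_ID_STEER     0x040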
C. Quadrature Rotary Encoder (QRE)
Quadrature encoders are regularly used in robotics to detect
and measure movement of the vehicle’s drive wheels. Quad
encoders are a refinement over single-channel encoders. While
they are more difficult to engineer – quadrature encoders
demand more precision in construction – they provide greater
resolution and they are capable of detecting changes in wheel
or motor direction. The name quadrature comes from the four
possible output states of the encoder sensor. Rather than just
a single pulse indicating a slotted or striped track in a code
wheel, quad encoders use a pair of tracks to go through four
distinct phases. [7] The track on the inside is set apart from
the outside track by 90 degrees as shown in figure 7.
Fig. 7: Proposed QRE layout for Rear ATV Wheels
This alternating sequence provides the four phases needed for proper quadrature. Since there are two tracks, the encoder requires two sensors, one for each track. The two sensors pick up the pattern changes in their respective tracks as shown in figure 8.
The order in which the signals change from LOW to HIGH in each channel indicates whether the encoder is rotating clockwise or counter-clockwise. By detecting the order of change, one can tell whether the wheels of the robot are spinning forward or backward.
D. DaNI 2.0 Robotic Platform
DaNI 2.0 is an out-of-the-box mobile robot platform with sensors, motors, and an NI 9632 Single-Board Reconfigurable I/O computer mounted on top of a Pitsco TETRIX erector robot base, as shown in figure 9.[8]
Fig. 8: Waveform output of the two sensors[7]
Fig. 9: DaNI 2.0 Robot[8]
The NI Single-Board RIO, shown in figure 10, is an embedded deployment platform that integrates a real-time processor,
reconfigurable field-programmable gate array (FPGA), and
analog and digital I/O on a single board. This board is
programmable with LabVIEW Real-Time, LabVIEW FPGA
and LabVIEW Robotics software modules.
The real-time processor runs the LabVIEW Real-Time Mod-
ule on the Wind River VxWorks real-time operating system
(RTOS) for extreme reliability and determinism. LabVIEW
contains built-in drivers and APIs for handling DMA or inter-
rupt request (IRQ) - based data transfer between the FPGA and
real-time processor. The 2.0 starter kit includes ultrasonic and
optical encoder sensors, but the reconfigurable I/O capability
allows us to expand the kit to experiment with a variety of
sensors including: LIDAR, Radar, Infrared, Compass, etc.[8]
E. Steering Theory
1) Differential Drive: DaNI 2.0 uses a drive mechanism
known as Differential drive. The robot has two drive wheels
mounted on a common axis as shown in figure 11, and each
wheel can independently be driven either forward or backward.
Fig. 10: NI Single-Board RIO 9632[8]
While the velocity of each wheel can vary, for the robot to perform rolling motion it must rotate about a point that lies along its common left and right wheel axis. The point that the robot rotates about is called the Instantaneous Center of Curvature (ICC).
Fig. 11: Differential Drive Kinematics[9]
By varying the velocities of the two wheels, the trajectories
that the robot takes can be varied. Because the rate of rotation
ω about the ICC must be the same for both wheels, we can
write the following equations:
ω(R + l/2) = Vr
ω(R − l/2) = Vl
where l is the distance between the centers of the two wheels, Vr and Vl are the right and left wheel velocities along the ground, and R is the distance from the ICC to the midpoint between the wheels.[9]
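Solving these two equations for the rotation rate and the ICC distance gives ω = (Vr − Vl)/l and R = (l/2)(Vl + Vr)/(Vr − Vl). A small C helper illustrating this (variable and function names are ours, not taken from the project code) is:

    #include <math.h>

    /* Differential-drive helper: given left/right wheel ground speeds (m/s)
     * and the wheel separation l (m), compute the rotation rate about the
     * ICC and the signed distance R from the ICC to the axle midpoint. */
    static void diff_drive_icc(double vl, double vr, double l,
                               double *omega, double *R)
    {
        *omega = (vr - vl) / l;                  /* rad/s */
        if (fabs(vr - vl) < 1e-9) {
            *R = INFINITY;                       /* straight-line motion */
        } else {
            *R = (l / 2.0) * (vl + vr) / (vr - vl);
        }
    }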
a) Forward Kinematics for Differential drive robots: In figure 11, assume the robot is at some position (x, y), headed in a direction making an angle θ with the X-axis. We assume the robot is centered at a point midway along the wheel axle. By manipulating the control parameters Vl and Vr (wheel velocities along the ground), we can get the robot to move to different positions and orientations. Knowing the velocities Vl and Vr, we can find the ICC location:
ICC = [x − R sin(θ), y + R cos(θ)]
and at time t + δt the robot's new pose is found by rotating about the ICC, as illustrated in Figure 13.
Fig. 12: Position of robot
Fig. 13: Forward Kinematics for Differential Drive[9]
From figure 13 we can say that the motion of the robot
is equivalent to 1) translating the ICC to the origin of the
coordinate system, 2) rotating about the origin by an angular
amount ωδt, and 3) translating back to the ICC.
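These three steps correspond to the standard forward-kinematics update for a differential-drive robot. A C sketch of that update, using the ICC expression given earlier, is shown below (this is the textbook form of the calculation, not the project's LabVIEW code):

    #include <math.h>

    /* Standard differential-drive pose update over a small interval dt.
     * (x, y, theta) is the current pose; omega and R are obtained from
     * the wheel velocities as described above. */
    static void diff_drive_step(double *x, double *y, double *theta,
                                double omega, double R, double dt)
    {
        double icc_x = *x - R * sin(*theta);   /* ICC computed from the pose */
        double icc_y = *y + R * cos(*theta);
        double a     = omega * dt;             /* rotation about the ICC     */

        double nx = cos(a) * (*x - icc_x) - sin(a) * (*y - icc_y) + icc_x;
        double ny = sin(a) * (*x - icc_x) + cos(a) * (*y - icc_y) + icc_y;

        *x = nx;
        *y = ny;
        *theta += a;
    }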
2) Ackermann Steering: The ATV uses an Ackermann steering drive. In traditional four-wheel vehicle dynamics, the steering angle refers neither to the angle of the right or left wheel nor to the angle of the steering wheel/handlebar column relative to the centerline of the vehicle. Instead, it is the Ackermann steering angle, which is the average of the two steered wheels' angles to the vehicle centerline. This is because the two steered wheels point at slightly different angles, to prevent the tires from slipping when turning in a circle, as shown in figure 14:
Ackermann angle:
δ = (δo + δi)/2
Turning radius:
R = L/δ
where:
δo = the angle of the outside steered wheel
δi = the angle of the inside steered wheel
L = the axle-to-axle wheelbase
t = the track width (center of tire to center of tire)
Fig. 14: Ackermann Steering Profile[10]
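As a small sketch of how these expressions might be applied in software (angles in radians; the names and the small-angle use of R = L/δ are our illustration, not the project's code):

    /* Ackermann steering helpers: delta_o and delta_i are the outside and
     * inside steered wheel angles (rad) and L is the wheelbase (m). */
    static double ackermann_angle(double delta_o, double delta_i)
    {
        return (delta_o + delta_i) / 2.0;
    }

    static double turning_radius(double L, double delta)
    {
        return L / delta;   /* R = L/delta, valid for small steering angles */
    }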
III. METHODOLOGY
A. Current ATV Modules and Components
From previous R&D, the ATV is currently controlled via an RC remote controller and operates by integrating various CAN enabled modules together. Each module is built with
different hardware components that easily send and decipher
CAN messages. The following section explains the roles of
various modules and the hardware components used in them.
1) Remote Control: A wireless remote control system is
used for this project. The Spektrum DX6 transmitter and
BR6000 receiver are used to control the ATV. These are shown
in Figures 15 and 16. Because the remote control uses a transmitter-receiver relationship, the control signals are the same as those used for servos.
Fig. 15: The Spektrum DX6
Remote[11]
Fig. 16: The Spektrum
BR6000 Receiver[11]
To interpret these signals, pulse measurement techniques
were utilized on four output channels of the receiver. The
receiver was powered by the same 5 volt regulator that
powered the Renesas Rx63N board.
2) Renesas Rx63N microcontroller: The Renesas RX63N
development board shown in figure 17 provides a useful
platform for evaluating the Renesas suite of development
tools for coding and debugging, using the High-performance
Embedded Workshop (HEW) IDE as well as programming the
device using the on-board SEGGER J-Link JTAG debugger.
It operates at up to 100 MHz and has built-in flash. A broad memory lineup is available, ranging from 256 KB to 2 MB of built-in flash (plus ROMless versions) and from 64 KB to 256 KB of built-in RAM.[13] The RX63N is mainly used because it provides a CAN module with an accompanying CAN API. This means it can accept analog values from either the ROS node or the remote control, translate them into CAN_H and CAN_L signals, and send them in a Data frame.
Fig. 17: Renesas Rx63N microcontroller[12]
3) Brake and Throttle Module: The Brake and Throttle module shown in figure 18 is responsible for the operation of the actuator used for the braking mechanism and the servo motor that facilitates the locomotion of the ATV.
Fig. 18: Brake and Throttle Module
a) Linear Actuator: The Duff-Norton linear actuator shown in figure 19 features load capacities ranging from 27 pounds to 25 tons, with gear- or belt-driven acme screw and ball screw systems and various AC and DC input voltages.
It is specially designed for a variety of commercial and
industrial applications. Applications include gates, dampers,
oven and processing tank doors, antennas, ergonomic furniture,
and agricultural equipment, etc.[15] The ATV uses the actuator to operate the braking mechanism with varying stroke lengths.
Fig. 19: Duff Norton Linear Actuator[14]
b) Throttle Servo: As with many vehicles, the throttle on
the Honda ATV has a spring return. When the throttle is not
engaged, it returns to an idle position. When the throttle is in
an idle position, the vehicle does not begin to move or will
come to a slow stop. This is advantageous because the brake
does not need to be applied in most operating conditions, and
even downhill slopes may only require braking to come to
a complete stop.[11] A standard Parallax servo, shown in Figure 20, was selected to control the throttle.
Fig. 20: Parallax Standard
Servo[11]
Fig. 21: Servo attached to
the throttle assembly [11]
Figure 21 shows the servo attached to the throttle assembly.
Note that the throttle can be manually controlled when the
servo is not powered, which allows a rider to turn off the
control circuits and drive the ATV manually.
c) CAN Transceiver: The SN65HVD251 CAN
transceiver shown in figure 22 is intended for use in
applications employing the CAN serial communication
physical layer in accordance with the ISO 11898 Standard.
It provides differential transmit capability to the bus and differential receive capability to a CAN controller at speeds of up to 1 megabit per second (Mbps).
Fig. 22: SN65HVD251 CAN Transceiver[16]
Designed for operation in harsh environments, the device features cross-wire, over-voltage, and loss-of-ground protection to 36 V. The common-mode range is -7 V to 12 V, and the device has a tolerance to transients of ±200 V.[17] The transceiver interfaces the single-ended CAN controller at the Brake and Throttle node with the differential CAN bus, connecting directly with the CAN bus module in the Renesas RX63N.
d) Logic Level Converter: The SparkFun bi-directional
logic level converter shown in figure 23 is a small device that
safely steps down 5 V signals to 3.3 V and steps up 3.3 V signals to 5 V at the same time.[18] The high and low voltages can be set, and signals are stepped up and down between them safely on the same channel.
Fig. 23: SparkFun Logic level converter[18]
The board needs to be powered from the two voltage sources (high voltage and low voltage). The logic level shifter is used to compensate for the difference in operating voltages of the servo and the Sakura board, which are 5 V and 3.3 V respectively.
e) Renesas Sakura board: The SAKURA board (Arduino compatible) is based on a Renesas RX63N series 32-bit microcontroller. It has on-chip flash memory and enhanced communication functions, including an Ethernet controller and USB 2.0 Host/Function.
Fig. 24: SAKURA Board[19]
The Renesas SAKURA board shown in figure 24 can be programmed with the cloud-based compiler supported by Renesas.[20] Detailed technical specifications of the Sakura are as follows:
• Microcontroller : RX63N(R5F563NBDDFP)
• Operating Voltage : 3.3V
• Clock Speed : 96MHz
• Digital I/O Pins : 55
• Analog Input Pins : 16
• Flash Memory : 1MB
• RAM : 128KB
• USB Function : mini-B
The Sakura board is used because it provides a CAN API for transmitting CAN bus messages as well as compatibility with Arduino. It receives messages from the servo via the level converter and transmits them to the motor controller connected to the braking actuator.
4) Steering Module: The ATV is equipped with an electronic power steering (EPS) module, as shown in Figures 25 and 26, which comprises the torque sensor (figure 28), the power steering motor (figure 27), and the steering angle encoder (figure 29).
Fig. 25: Steering Module
Fig. 26: Electronic Power
Steering [11]
a) Power steering motor: The EPS provides power to the
motor in response to a torque sensor attached to the steering
column. As the rider turns the handle bars, torque is applied to
the steering shaft, which is translated into a need for steering
assistance.
Fig. 27: Power Steering Motor [2]
The DC resistance of the power steering motor shown in figure 27 is approximately 0.33 ohms. The EPS can be fooled into thinking that the torque sensor is transmitting a signal; the EPS then provides power to the steering motor, thereby eliminating the need to develop a separate control system for the motor.
b) Torque Sensor: At the heart of the Electric Power Steering system's operation is the torque sensor shown in figure 28. A deformable element is installed between the operator's handlebar and the steered wheels' linkage. A resistive sensor array is placed at the interface of the two sections, and when a differential angular displacement is detected at the junction, a directionally dependent change in resistance is given as output by the sensor.
The sensor comprises two completely independent resistive elements, one for each direction (right/left). A third wire, as shown in figure 28, is connected as a ground reference.
c) Steering Angle Encoder: From the prior work on the
steering, a US Digital MAE3 Digital Magnetic Encoder shown
in figure 29 was affixed to the steering column to provide
feedback for steering angle closed-loop control. The encoder employs absolute position encoding and a 10-bit D/A converter, allowing it to essentially function as an analog potentiometer. In its connected configuration, the potentiometer has a 5 V full-scale range. Since not all of this range is used, because of the limited lock-to-lock steering angle, the output is calibrated to the portion actually used under the 5 V supply.
Fig. 28: Torque Sensor[21]
Fig. 29: Steering Angle Encoder[21]
The encoder, as configured for analog output, has a 10-bit resolution. This yields a functional resolution of approximately 3.56 mV/bit. The output is then read into the microcontroller via a 10-bit A/D converter, preserving this resolution if interference is not considered.
B. Proposed Components and Methods
1) National Instruments (NI) myRIO: The NI myRIO, shown in figure 30, places dual-core ARM Cortex-A9 real-time processing and Xilinx FPGA customizable I/O into a single module. RIO devices feature a processor and an FPGA, both of which are fully programmable using NI LabVIEW software. The NI myRIO is used because it is programmable with the LabVIEW Robotics toolkit. This toolkit provides a steering frame that can be set to three different modes: User-defined, Differential, and Ackermann. Figure 31 shows the Ackermann steering frame. The frame needs four input values, namely wheel radius, gear ratio, wheel separation width, and wheel separation length. From these parameters the program calculates the necessary kinematics needed to send the proper signals to the motor controllers and other steering and drive related devices.
Fig. 30: NI myRIO[22]
Fig. 31: Ackermann Steering frame Wizard in LabVIEW
Robotics
2) ROS Master: The ROS Master Node runs on a Linux
Operating System (Ubuntu). The version of ROS incorporated
for our research is ROS Indigo. The ROS Master provides
naming and registration services to the rest of the nodes in
the ROS system. It tracks publishers and subscribers to topics
as well as services. The role of the Master is to enable the ROS node on the DaNI 2.0 to communicate with the master node. To start the Master, the 'roscore' command is entered in the terminal.
The 'roscore' command does not take any arguments, and the master should continue running for the entire time ROS is used. The master can be stopped by typing Ctrl+C in
its terminal. The ROS master generates a TCP port number
as shown in figure 32. This port is used by the ROS node for
communicating with the master.
Next, the 'teleop_twist_keyboard' node, shown in figure 33, is run; it is a generic keyboard teleoperation node for Twist-controlled robots. It accepts real-time input from the keyboard and publishes it as Twist messages.
3) ROS Node (DaNI 2.0): A small-scale prototype of the ATV, the DaNI 2.0, was used to test the functioning of the system. DaNI 2.0 is programmable with the LabVIEW Real-Time, LabVIEW FPGA, and LabVIEW Robotics software modules, which were used to publish Odometry information and subscribe to Twist commands.
Fig. 32: ROS Master running in a Linux environment (Ubuntu)
Fig. 33: ROS Default Teleop Window used to Control the
Subscribing Robot
The ROS plugin for LabVIEW shown in figure 34 was developed by Tufts University. This plugin provides a list of all the ROS topic frames and functions for sending and receiving data between the subscribing robot and the publishing computer. The Teleop message packet shown in figure 35 is used to remotely operate the subscribing robot. The basic commands that are sent from the keyboard are used to move or actuate the robot's frame and motors.
The ROS for LabVIEW toolkit enables the user to simply
define what type of message will be published or subscribed
to. In the example shown in Figure 36 the ROS message will
publish Twist commands that are sent to the robot enabling it
to move. The Twist command is part of the command velocity (cmd_vel) parent topic.
The Odometry message packet in figure 37 is used to send
position and pose information to the subscribing laptop. The
Odometry message enables ROS to track where the robot is
in a space. The Odometry message is the second vital key in
Fig. 34: ROS for LabVIEW Toolkit
Fig. 35: Front panel that contains the structure of a Twist
message, a node parameter, and error detection
enabling a robotic platform to become a standard ROS base
robot. The QRE on the ATV was essential to enable the correct
feedback information to be passed to the ROS master node to
calculate the present and future trajectories of the robot.
The ROS for LabVIEW toolkit also enables the user to
publish Odometry information back to the master node laptop.
In the example shown in Figure 38 the ROS message will
subscribe to the laptop and receive the Odometry information
from the robot. This message is then parsed into its respective
format for viewing by the user. The Odometry message is a
navigation message that is part of the odom parent topic.
Every item in the ROS Toolkit for LabVIEW is dependent
on the toolkit finding and establishing communication with the
ROS master node. In the screenshot in Figure 39, a prompt is shown to the user querying for the IP address of the master node. It is also vital that the laptop and the wireless
router on the robot be on the same local network.
Due to the abstraction level of ROS the end user is left to
determine how to interpret and distribute the Twist commands
to enable the robot to move, stop, turn, etc. The robot in turn,
must be able to publish the Odometry data structure back to
the ROS master node. For the ATV, to best simulate an actual
automobile the Twist commands and accompanying Odometry
messages will be distributed using the CAN bus protocol. Each
Fig. 36: The block diagram of publishing Twist commands to
control the Robot
Fig. 37: Front panel that contains a structure of Odometry
message, a node parameter and error detection
node takes the commands received from the CAN bus and drives its respective components. This aids in simulating a setup similar to a full-size vehicle.
4) Quadrature Rotary Encoders: A pair of quadrature rotary encoders was added to the rear wheels of the ATV for the odometry measurements that ROS requires to determine the spatial coordinates of the ATV. The encoder system comprises two sets of three Sharp GP2D120 infrared proximity sensors, one set on each rear wheel, as shown in figures 40 and 41. The encoder disc has a minimum resolution of 11 degrees.
a) Sharp GP2D120 IR Sensor: The GP2D120 in figure
42 is a distance measuring sensor with integrated signal
processing and analog voltage output.
Following are the technical specifications of the IR sensor:
1) Effective range: 4 to 30 cm
2) Output: Analog
3) Typical response time: 39 ms
4) Typical start up delay: 44 ms
5) Average current consumption: 33 mA
It is composed of an integrated combination of a PSD (position sensitive detector), an IR-LED, and a signal processing circuit. Because the sensor uses the triangulation method, the varying reflectivity of the object, the environmental temperature, and the operating duration do not easily influence the distance detection. The device outputs a voltage corresponding to the detection distance, so it can also be used as a proximity sensor.[23] The sensors are used to calculate the velocity and determine the direction of rotation of the wheels.
Fig. 38: Block diagram of a ROS subscription to Odometry messages to receive feedback from the Robot
Fig. 39: Establishing the ROS Master IP in the LabView Main Front Panel
The outermost sensor in figure 43 is used to count the
number of pulses and determine the speed of the vehicle. For
calculating the speed we use the formula:
S = 2πr/t
where,
r = radius of the wheel
t = time taken to complete one revolution
The code in figure 44 shows that one revolution consists
of 8 pulses. This is because the sensors used for the encoders
have a specification that the width of each strip should be more
than twice the width of the sensor itself.
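A hedged C sketch of the speed calculation described above is shown below; the eight-pulse count comes from the text, while the wheel radius value and the timing source are placeholders rather than the project's actual code.

    #define PI             3.14159265358979
    #define PULSES_PER_REV 8   /* pulses on the outer track per revolution */

    /* Implements S = 2*pi*r / t: rev_time_s is the measured time (s) for
     * PULSES_PER_REV pulses, i.e. one full wheel revolution, and
     * wheel_radius_m is the wheel radius (a placeholder parameter here). */
    static double wheel_speed_mps(double wheel_radius_m, double rev_time_s)
    {
        return (2.0 * PI * wheel_radius_m) / rev_time_s;
    }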
Since there are 16 partitions on the encoder disc (8 black and 8 white), we get a resolution of 22.5 degrees. But because we are using a quadrature encoder, the middle ring is set 90 degrees ahead of the outer ring in phase. By correlating the values from the outer and middle sensors, we can improve the resolution to 11.25 degrees.
The outermost and middle sensors are used to determine the direction of the wheel. When the code shown in figure 45 finds a low-to-high transition on the outermost sensor, it checks whether the middle sensor is low or high and then reports the direction of rotation as clockwise or counter-clockwise, respectively.
Fig. 40: QRE setup on the ATV with Marker for Rotation (Left-Side)
Fig. 41: QRE setup on the ATV with Marker for Rotation (Right-Side)
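A minimal sketch of this check follows; the sensor values are assumed to come from whatever digital-input routine the microcontroller provides, and the names are ours rather than the project's.

    typedef enum { DIR_CLOCKWISE, DIR_COUNTER_CLOCKWISE } rotation_dir_t;

    static int prev_outer = 0;   /* last sampled state of the outer sensor */

    /* On a LOW-to-HIGH transition of the outermost sensor, sample the
     * middle sensor: low means clockwise, high means counter-clockwise,
     * per the behaviour described above. Returns 1 when *dir is updated. */
    static int update_direction(int outer, int middle, rotation_dir_t *dir)
    {
        int changed = 0;
        if (outer == 1 && prev_outer == 0) {
            *dir = (middle == 0) ? DIR_CLOCKWISE : DIR_COUNTER_CLOCKWISE;
            changed = 1;
        }
        prev_outer = outer;
        return changed;
    }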
The innermost sensor has two functions: a) to mark the starting point for the outermost sensor to count the pulses, and b) to determine the angle of rotation if the distance traversed by the wheel is less than 55 degrees. We used four pulses
for the innermost sensor to reduce the margin of error in
determining the starting point. If there was a single pulse, the
error would be 349 degrees but with four pulses it decreases
to 55 degrees.
Figure 46 shows a code snippet for the angle measurement. We converted all the possible combinations of starting and ending points within the 55 degree sector between two consecutive innermost black stripes to corresponding decimal values. For example, if we consider the initial value as 0 and the final value as 2, then this combination corresponds to an angular rotation of 33 degrees.
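A sketch of such a lookup is shown below; only the (start = 0, end = 2) pair and its 33 degree result are quoted from the text, and the remaining calibrated entries are left as placeholders.

    /* Angle lookup for the sector between two innermost black stripes.
     * Only the (0, 2) -> 33 degree entry is taken from the paper; the
     * rest of the calibrated table is omitted here. */
    static int sector_angle_deg(int start_count, int end_count)
    {
        if (start_count == 0 && end_count == 2)
            return 33;
        /* ... remaining calibrated (start, end) -> angle pairs ... */
        return -1;   /* unknown combination */
    }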
Fig. 42: SHARP GP2D120 IR Sensor[23]
Fig. 43: Three IR Sensors mounted on the ATV
5) Brake Actuator: Previous teams had programmed the
brake actuator as shown in figure 47 to operate on two settings:
full release and full brake. We added more resolution to the
speed and stroke length of the actuator since ROS requires
variable speeds of braking to perform various mapping and
other maneuvers.
Now the brake can be applied at different speeds for different road conditions. Table I shows the different stroke lengths for the different braking conditions. The first column shows the time required to apply the brake for the corresponding condition.
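A sketch of how these braking levels could be encoded in software is shown below; the numeric values are taken from Table I, while the type and variable names are ours.

    /* Braking resolution settings from Table I: actuation time (s) and
     * actuator stroke length (cm) for each braking level. */
    typedef struct {
        double time_s;
        double stroke_cm;
        const char *label;
    } brake_setting_t;

    static const brake_setting_t brake_table[] = {
        { 1.0, 4.1, "Full release"        },
        { 1.5, 5.1, "Quarter brake"       },
        { 2.0, 6.1, "Half brake"          },
        { 2.2, 6.4, "Three-quarter brake" },
        { 2.5, 7.2, "Full brake"          },
    };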
IV. RESULTS
A. Quadrature Rotary Encoder
The graph in Figure 48 depicts a real-time test of the QRE-based system implemented on the All-Terrain Vehicle. The test was performed over a two-minute window in which the ATV was idle for approximately the first minute; during the last minute, a gradual throttle input was applied to produce a consistent acceleration. The QRE showed a speed of approximately two miles per hour (mph) in the idle state.
As shown in the graph in Figure 48, when the ATV was accelerated to its maximum speed in first gear, the QRE output showed a linear relationship with respect to the speed of the ATV. The ATV was then returned to an idle state, which concluded the QRE test setup and data recording shown in figure 49.

Time (s)    Stroke length (cm)    Brake setting
1.0         4.1                   Full release
1.5         5.1                   Quarter brake
2.0         6.1                   Half brake
2.2         6.4                   Three-quarter brake
2.5         7.2                   Full brake
TABLE I: Resolution achieved in the braking mechanism

Fig. 44: Code snippet - Calculation of the speed of the ATV
Fig. 45: Code snippet - Direction of rotation of the wheels
Fig. 46: Code snippet - Measurement of angle of rotation before the starting point
Fig. 47: Setup of Actuator on the ATV
V. CONCLUSION
The purpose of this research was to allow an existing RC
controlled All-Terrain Vehicle to be ROS enabled. Previous
work had enabled the robot to be successfully controlled via
an RC controller and receiver. With the base robot working
there was a need to add more sensors to the robot. The
ROS framework was chosen for its open-source format and
community development. ROS gives the user the ability to
deploy elaborate localization and mapping techniques with
little effort. Since ROS operates at a high level, the user is left with the task of ensuring the commands sent by the ROS master are properly translated to the robot's motors, actuators, etc. Likewise, the user must package the feedback from the wheels and other sensors in the exact data structure of the Odometry message.
The novel QRE that was developed for the ATV enabled it
to send back this information to the ROS master node. The
additional resolution provided by the new braking algorithm gives the braking actuator higher-fidelity movement and thus results in finer-grained control of the ATV. With the development of the QRE and the addition of the brake motor resolution, the ATV is now ready to be tested with any localization and mapping technique that the end user desires. Lastly, with the ROS framework, the user can add a multitude of off-the-shelf (OTS) sensors and components to the base ATV to further its development.
Fig. 48: Graph of QRE Test Results on ATV
Fig. 49: QRE Test Setup (Rear Wheels of the ATV Were Elevated as Shown)
VI. FUTURE WORK
With the base ROS framework now able to be deployed on
the ATV future teams can begin to add more sensors to the
robot. Mainly, the SICK LMS200 1D LIDAR will be added to enable the robot's surroundings to be viewed in the native ROS RVIZ environment. The image in Figure 50 shows the
currently mounted LMS200 LIDAR sensor on the ATV.
The image in Figure 51 shows a screenshot of the RVIZ
software that was used to map out an existing building. The
ROS framework enables sensors, such as the LMS200 LIDAR,
to be deployed on the ATV quickly. All the user must do is download the ROS driver for that sensor, and it will automatically be ported into the robot's underlying architecture.
With the simple design of the QRE and the braking resolution provided by the braking algorithm, researchers can easily port their particular robotic platform into a ROS enabled platform. The QRE and the braking resolution algorithm provide the necessary framework to, theoretically, make any current robotic platform ROS enabled.
REFERENCES
[1] M. Quigley, K. Conley, B. Gerkey, J. Faust, T. Foote, J. Leibs,
R. Wheeler, and A. Y. Ng, “Ros: an open-source robot operating system,”
in ICRA workshop on open source software, vol. 3, no. 3.2, 2009, p. 5.
Fig. 50: Currently mounted LMS200 LIDAR sensor
Fig. 51: Screenshot of ROS's Native RVIZ Environment[24]
[2] A. Cortner, J. M. Conrad, and N. A. BouSaba, “Autonomous all-terrain
vehicle steering,” in Southeastcon, 2012 Proceedings of IEEE. IEEE,
2012, pp. 1–5.
[3] 2016. [Online]. Available: http://www.ros.org/about-ros
[4] 2016. [Online]. Available: http://wiki.ros.org/ROS/Introduction
[5] 2016. [Online]. Available: https://en.wikipedia.org/wiki/CAN_bus
[6] S. K. R. Gurram, “Implementation of controller area network (can)
bus in an autonomnous all-terrain vehicle,” Ph.D. dissertation, The
University of North Carolina at Charlotte, 2011.
[7] 2016. [Online]. Available: http://www.robotoid.com/appnotes/
circuits-quad-encoding.html
[8] 2016. [Online]. Available: https://www.pitsco.com/About/?art=5000
[9] E. Lee, “The problem with threads,” Computer, vol. 39, no. 5, pp. 33–42,
2006.
[10] 2016. [Online]. Available: http://www.diracdelta.co.uk/science/source/a/
c/ackermann%20steering/source.html#.Vx1OlzArLIU
[11] R. A. McKinney, M. J. Zapata, J. M. Conrad, T. W. Meiswinkel, and
S. Ahuja, “Components of an autonomous all-terrain vehicle,” in IEEE
SoutheastCon 2010 (SoutheastCon), Proceedings of the. IEEE, 2010,
pp. 416–419.
[12] 2016. [Online]. Available: http://www.renesas.com/products/tools/
introductory evaluation tools/renesas demo kits/yrdkrx63n/index.jsp
[13] 2016. [Online]. Available: http://www.am.renesas.com/products/tools/
introductory evaluation tools/renesas demo kits/yrdkrx63n/index.jsp
[14] 2016. [Online]. Available: http://www.minutemancontrols.com/
duff-norton main.html
[15] 2016. [Online]. Available: http://www.duffnorton.com/products.aspx?
id=7841
[16] 2016. [Online]. Available: http://www.ti.com/product/SN65HVD230
[17] 2016. [Online]. Available: http://www.ti.com/product/SN65HVD251
[18] SparkFun Electronics, “SparkFun Logic Level Converter - Bi-Directional - BOB-12009,” 2016. [Online]. Available: https://www.sparkfun.com/products/12009
[19] 2016. [Online]. Available: http://sakuraboard.net/gr-sakura en.html
[20] 2016. [Online]. Available: http://www.rs-online.com/designspark/
electronics/nodes/view/type:design-centre/slug:GR-Sakura
[21] J. R. Henderson, J. M. Conrad, and C. Pavlich, “Using a can bus for
control of an all-terrain vehicle,” in SOUTHEASTCON 2014, IEEE.
IEEE, 2014, pp. 1–5.
[22] 2016. [Online]. Available: http://www.ni.com/myrio/what-is/
[23] 2016. [Online]. Available: https://www.pololu.com/product/1136
[24] 2016. [Online]. Available: http://www.iroboapp.org/images/f/f8/Rviz2.png

More Related Content

What's hot

EC 6802 WIRELESS NETWORK_ BABU M_ unit 3 ,4 & 5 PPT
EC 6802 WIRELESS NETWORK_ BABU M_ unit 3 ,4 & 5 PPTEC 6802 WIRELESS NETWORK_ BABU M_ unit 3 ,4 & 5 PPT
EC 6802 WIRELESS NETWORK_ BABU M_ unit 3 ,4 & 5 PPTbabuece
 
(Paper Presentation) DSDV
(Paper Presentation) DSDV(Paper Presentation) DSDV
(Paper Presentation) DSDVRajesh Piryani
 
3 gppevolutionwp
3 gppevolutionwp3 gppevolutionwp
3 gppevolutionwppavel
 
Paper lte-protocol-signaling
Paper lte-protocol-signalingPaper lte-protocol-signaling
Paper lte-protocol-signalingPetrona Frensel M
 
Ccna day5-140715152501-phpapp01
Ccna day5-140715152501-phpapp01Ccna day5-140715152501-phpapp01
Ccna day5-140715152501-phpapp01Sachin Morya
 
Unit VIII wireless sensor networks
Unit VIII wireless sensor networksUnit VIII wireless sensor networks
Unit VIII wireless sensor networkssangusajjan
 
Direct Link Lan
Direct Link LanDirect Link Lan
Direct Link Lanyanhul
 
Mac adhoc (1)
Mac adhoc (1)Mac adhoc (1)
Mac adhoc (1)hinalala
 
LTE Training Course
LTE Training CourseLTE Training Course
LTE Training CourseChiehChun
 
Architecture of the lte air interface
Architecture of the lte air interfaceArchitecture of the lte air interface
Architecture of the lte air interfaceEke Okereke
 
Packet Guide SONET/SDH
Packet Guide SONET/SDHPacket Guide SONET/SDH
Packet Guide SONET/SDHscribd1
 

What's hot (20)

Lte imp
Lte impLte imp
Lte imp
 
EC 6802 WIRELESS NETWORK_ BABU M_ unit 3 ,4 & 5 PPT
EC 6802 WIRELESS NETWORK_ BABU M_ unit 3 ,4 & 5 PPTEC 6802 WIRELESS NETWORK_ BABU M_ unit 3 ,4 & 5 PPT
EC 6802 WIRELESS NETWORK_ BABU M_ unit 3 ,4 & 5 PPT
 
IJSTEV2I12120
IJSTEV2I12120IJSTEV2I12120
IJSTEV2I12120
 
(Paper Presentation) DSDV
(Paper Presentation) DSDV(Paper Presentation) DSDV
(Paper Presentation) DSDV
 
3 gppevolutionwp
3 gppevolutionwp3 gppevolutionwp
3 gppevolutionwp
 
Wcdma channels
Wcdma channels Wcdma channels
Wcdma channels
 
Paper lte-protocol-signaling
Paper lte-protocol-signalingPaper lte-protocol-signaling
Paper lte-protocol-signaling
 
3GPP LTE-MAC
3GPP LTE-MAC3GPP LTE-MAC
3GPP LTE-MAC
 
Ccna day5-140715152501-phpapp01
Ccna day5-140715152501-phpapp01Ccna day5-140715152501-phpapp01
Ccna day5-140715152501-phpapp01
 
Unit VIII wireless sensor networks
Unit VIII wireless sensor networksUnit VIII wireless sensor networks
Unit VIII wireless sensor networks
 
Direct Link Lan
Direct Link LanDirect Link Lan
Direct Link Lan
 
Ccna day5
Ccna day5Ccna day5
Ccna day5
 
Mac adhoc (1)
Mac adhoc (1)Mac adhoc (1)
Mac adhoc (1)
 
LTE Training Course
LTE Training CourseLTE Training Course
LTE Training Course
 
Ccna day3
Ccna day3Ccna day3
Ccna day3
 
Architecture of the lte air interface
Architecture of the lte air interfaceArchitecture of the lte air interface
Architecture of the lte air interface
 
Wi Max
Wi MaxWi Max
Wi Max
 
Lte protocols
Lte protocolsLte protocols
Lte protocols
 
Paper lte-interoperable
Paper lte-interoperablePaper lte-interoperable
Paper lte-interoperable
 
Packet Guide SONET/SDH
Packet Guide SONET/SDHPacket Guide SONET/SDH
Packet Guide SONET/SDH
 

Similar to ROS Enabled All-Terrain Vehicle Design

Can basics
Can basicsCan basics
Can basicscdackp
 
The Design of an MVB Communication Controller Based on an FPGA
The Design of an MVB Communication Controller Based on an FPGAThe Design of an MVB Communication Controller Based on an FPGA
The Design of an MVB Communication Controller Based on an FPGAIJRESJOURNAL
 
PERFORMANCE ANALYSIS OF AODV, DSDV AND AOMDV USING WIMAX IN NS-2
PERFORMANCE ANALYSIS OF AODV, DSDV AND AOMDV USING WIMAX IN NS-2PERFORMANCE ANALYSIS OF AODV, DSDV AND AOMDV USING WIMAX IN NS-2
PERFORMANCE ANALYSIS OF AODV, DSDV AND AOMDV USING WIMAX IN NS-2IAEME Publication
 
| IJMER | ISSN: 2249–6645 | www.ijmer.com | Vol. 4 | Iss. 4 | April 2014 ...
    | IJMER | ISSN: 2249–6645 | www.ijmer.com | Vol. 4 | Iss. 4 | April 2014 ...    | IJMER | ISSN: 2249–6645 | www.ijmer.com | Vol. 4 | Iss. 4 | April 2014 ...
| IJMER | ISSN: 2249–6645 | www.ijmer.com | Vol. 4 | Iss. 4 | April 2014 ...IJMER
 
Controller area network as the security of the vehicles
Controller area network as the security of the vehiclesController area network as the security of the vehicles
Controller area network as the security of the vehiclesIAEME Publication
 
Simulation of LTE Network Parameters
Simulation of  LTE Network ParametersSimulation of  LTE Network Parameters
Simulation of LTE Network ParametersIRJET Journal
 
Controller Area Network (CAN) Different Types
Controller Area Network (CAN) Different TypesController Area Network (CAN) Different Types
Controller Area Network (CAN) Different TypesFebinShaji9
 
Simulation and Performance Analysis of Long Term Evolution (LTE) Cellular Net...
Simulation and Performance Analysis of Long Term Evolution (LTE) Cellular Net...Simulation and Performance Analysis of Long Term Evolution (LTE) Cellular Net...
Simulation and Performance Analysis of Long Term Evolution (LTE) Cellular Net...ijsrd.com
 
Collision Avoidance Protocol for Inter Vehicular Communication
Collision Avoidance Protocol for Inter Vehicular Communication  Collision Avoidance Protocol for Inter Vehicular Communication
Collision Avoidance Protocol for Inter Vehicular Communication Editor IJCATR
 
Collision Avoidance Protocol for Inter Vehicular Communication
Collision Avoidance Protocol for Inter Vehicular CommunicationCollision Avoidance Protocol for Inter Vehicular Communication
Collision Avoidance Protocol for Inter Vehicular CommunicationEditor IJCATR
 
PERFORMANCE EVALUATION OF CDMAROUTER FOR NETWORK - ON - CHIP
PERFORMANCE EVALUATION OF CDMAROUTER FOR NETWORK - ON - CHIPPERFORMANCE EVALUATION OF CDMAROUTER FOR NETWORK - ON - CHIP
PERFORMANCE EVALUATION OF CDMAROUTER FOR NETWORK - ON - CHIPVLSICS Design
 
International Journal of Engineering Research and Development (IJERD)
International Journal of Engineering Research and Development (IJERD)International Journal of Engineering Research and Development (IJERD)
International Journal of Engineering Research and Development (IJERD)IJERD Editor
 
Harini_Mobile_Robotics
Harini_Mobile_RoboticsHarini_Mobile_Robotics
Harini_Mobile_RoboticsHarini Suresh
 

Similar to ROS Enabled All-Terrain Vehicle Design (20)

Can Protocol For Automobiles
Can Protocol For AutomobilesCan Protocol For Automobiles
Can Protocol For Automobiles
 
Can basics
Can basicsCan basics
Can basics
 
The Design of an MVB Communication Controller Based on an FPGA
The Design of an MVB Communication Controller Based on an FPGAThe Design of an MVB Communication Controller Based on an FPGA
The Design of an MVB Communication Controller Based on an FPGA
 
PERFORMANCE ANALYSIS OF AODV, DSDV AND AOMDV USING WIMAX IN NS-2
PERFORMANCE ANALYSIS OF AODV, DSDV AND AOMDV USING WIMAX IN NS-2PERFORMANCE ANALYSIS OF AODV, DSDV AND AOMDV USING WIMAX IN NS-2
PERFORMANCE ANALYSIS OF AODV, DSDV AND AOMDV USING WIMAX IN NS-2
 
11.chapters
11.chapters11.chapters
11.chapters
 
Shubham chakravarty ppt_wcan
Shubham chakravarty ppt_wcanShubham chakravarty ppt_wcan
Shubham chakravarty ppt_wcan
 
| IJMER | ISSN: 2249–6645 | www.ijmer.com | Vol. 4 | Iss. 4 | April 2014 ...
    | IJMER | ISSN: 2249–6645 | www.ijmer.com | Vol. 4 | Iss. 4 | April 2014 ...    | IJMER | ISSN: 2249–6645 | www.ijmer.com | Vol. 4 | Iss. 4 | April 2014 ...
| IJMER | ISSN: 2249–6645 | www.ijmer.com | Vol. 4 | Iss. 4 | April 2014 ...
 
Controller area network as the security of the vehicles
Controller area network as the security of the vehiclesController area network as the security of the vehicles
Controller area network as the security of the vehicles
 
Simulation of LTE Network Parameters
Simulation of  LTE Network ParametersSimulation of  LTE Network Parameters
Simulation of LTE Network Parameters
 
Controller Area Network (CAN) Different Types
Controller Area Network (CAN) Different TypesController Area Network (CAN) Different Types
Controller Area Network (CAN) Different Types
 
Dl34689693
Dl34689693Dl34689693
Dl34689693
 
Epma 013
Epma 013Epma 013
Epma 013
 
Simulation and Performance Analysis of Long Term Evolution (LTE) Cellular Net...
Simulation and Performance Analysis of Long Term Evolution (LTE) Cellular Net...Simulation and Performance Analysis of Long Term Evolution (LTE) Cellular Net...
Simulation and Performance Analysis of Long Term Evolution (LTE) Cellular Net...
 
Collision Avoidance Protocol for Inter Vehicular Communication
Collision Avoidance Protocol for Inter Vehicular Communication  Collision Avoidance Protocol for Inter Vehicular Communication
Collision Avoidance Protocol for Inter Vehicular Communication
 
Collision Avoidance Protocol for Inter Vehicular Communication
Collision Avoidance Protocol for Inter Vehicular CommunicationCollision Avoidance Protocol for Inter Vehicular Communication
Collision Avoidance Protocol for Inter Vehicular Communication
 
P26093098
P26093098P26093098
P26093098
 
PERFORMANCE EVALUATION OF CDMAROUTER FOR NETWORK - ON - CHIP
PERFORMANCE EVALUATION OF CDMAROUTER FOR NETWORK - ON - CHIPPERFORMANCE EVALUATION OF CDMAROUTER FOR NETWORK - ON - CHIP
PERFORMANCE EVALUATION OF CDMAROUTER FOR NETWORK - ON - CHIP
 
International Journal of Engineering Research and Development (IJERD)
International Journal of Engineering Research and Development (IJERD)International Journal of Engineering Research and Development (IJERD)
International Journal of Engineering Research and Development (IJERD)
 
Role of CAN BUS in automotives
Role of CAN BUS in automotivesRole of CAN BUS in automotives
Role of CAN BUS in automotives
 
Harini_Mobile_Robotics
Harini_Mobile_RoboticsHarini_Mobile_Robotics
Harini_Mobile_Robotics
 

ROS Enabled All-Terrain Vehicle Design

  • 1. Design and Development of a ROS Enabled All-Terrain Vehicle Platform Benjamin B. Rhoades, Disha Srivastava, James M. Conrad William Lee College of Engineering University of North Carolina at Charlotte Charlotte, North Carolina 28223 Email: brhoade1@uncc.edu, dsrivas2@uncc.edu, jmconrad@uncc.edu Abstract—Writing software for robots is difficult, particularly as the scale and scope of robotics continues to grow. Different types of robots can have wildly varying hardware, making code reuse nontrivial. On top of this, the sheer size of the required code can be daunting, as it must contain a deep stack starting from driver-level software and continuing up through abstract reasoning, and beyond. To meet these challenges a Robotic Operating System (ROS) framework was developed [1]. Using the ROS framework, this paper presents research on designing and developing a ROS enabled ATV with an Ackermann steering frame. The main goal of this project was to allow an All-Terrain Vehicle (ATV) to either be controlled using ROS or using the existing remote control mechanism developed by previous work. A small scale system that utilized a DaNI 2.0 robotic platform was used to verify the base functionality needed to operate as a base ROS enabled robot. In order to replicate the small scale to the ATV, a novel Quadrature Rotary Encoder (QRE) was designed and developed. Additional resolution was added to the previous braking control system to accommodate the high fidelity that ROS requires to control the ATV. Index Terms—Robot Operating System (ROS), Quadrature Rotary Encoder, Controller Area Network (CAN), All-Terrain Vehicle (ATV) I. INTRODUCTION The University of North Carolina at Charlotte (UNCC) is currently working in the research and development (R&D) of an autonomous ATV. Multiple teams have worked on this project since it started. Figure 1 shows an image of this ATV. Fig. 1: Previous Setup of RC Controlled ATV[2] 1) Previous Work: Previous work had enabled the ATV to be controlled via a remote control system. To fully mimic a full size vehicle the Controller Area Network (CAN) bus was simulated via CAN transceiver on the ATV. Multiple micro-controllers were used to distribute and parse the control messages for the ATV with its current design. Previous teams have enabled the ATV to be fully controlled via a servo controlled throttle, a braking actuator, and a steering assist motor. These components enabled the ATV to be Radio Controlled (RC). 2) System Overview: The system block diagram in Figure 2 illustrates the high level connection of the ATV. The National Instruments (NI) myRIO acts as the primary controller to which the ROS Master node subscribes & publishes the Twist and Odometry information respectively. Fig. 2: High Level Block Diagram of Current ATV The Rx63N microcontroller translates these command mes- sages into the appropriate CAN bus message format. These messages travel via the CAN hi (CAN H) and CAN lo (CAN L) wires to their respective CAN transceiver that then passes to the Steering, Brake and Throttle Modules. II. BACKGROUND A. Robot Operating System (ROS) The Robot Operating System (ROS) is a flexible framework for writing robot software. It is a collection of tools, libraries and conventions that aim to simplify the task of creating com- plex and robust robot behavior across a wide variety of robotic
  • 2. platforms.[3] So we can say that ROS is an open-source, meta- operating system for a robot. It provides the services one would expect from an operating system, including hardware abstraction, low-level device control, implementation of com- monly used functionality, message passing between processes and package management.[4] The fundamental concepts of the ROS implementation are nodes, messages, topics, and services, Nodes are processes that perform computation. ROS is designed to be modular at a fine-grained scale: a system is typically comprised of many nodes. These nodes communicate with each other by passing messages. A message is a strictly typed data structure. It supports standard primitive data types (integer, floating point, Boolean, etc.) as well as the arrays of primitive types and constants. A node sends a message by publishing it to a given topic, which can be a simple string such as odometry or twist. A node that is interested in a certain kind of data will subscribe to the appropriate topic. There can be multiple concurrent publishers and subscribers for a single topic, and a single node may publish and/or subscribe to multiple topics. In general, publishers and subscribers are not aware of each other. A service is defined by a string name and a pair of strictly typed messages: one for the request and one for the response.[1] Fig. 3: Communication between ROS Master and ROS Node ROS starts with the ROS Master. The Master allows all other ROS Nodes to find and talk to each other as shown in Figure 3. B. Controller Area Network (CAN) Fig. 4: Standard CAN Message Packet Structure[5] Controller Area Network (CAN) is an asynchronous serial communication protocol which follows ISO 11898 standards and is widely accepted in automobiles due to its real time performance, reliability and compatibility with wide range of devices. [6] It was developed by BOSCH to enable commu- nication between the various modules that were present in the vehicle. CAN is a two wire differential bus with data rates up to 1 Megabit per Second (Mbps) and offers a very high level of security. CAN bus is a half-duplex, with two wire differential bus, CAN L and CAN H, for the nodes to transmit data or information. The bus uses two logic levels called dominant and recessive levels, where dominant level is referred when TTL = 0V and recessive level is referred when TTL = 5V. The dominant level always overrides recessive level and this concept is used to implement the bus arbitration.[6] In the CAN protocol, nodes communicate data or in- formation through messages termed as frames. A frame is transmitted on to the bus only when the bus is in Idle state. There are four different types of frames which are used for communication over CAN bus.[5] 1) Data Frame - Used to send data 2) Remote Frame - Used to request data 3) Error Frame - Used to report an error condition 4) Overload Frame - Used to request a delay between two data or remote frames The architecture of Data and Remote frame are exactly the same. A Data frame has higher priority than a Remote frame. Each Data and Remote frame starts with a Start Of Frame (SOF) field and end with an End Of Frame (EOF) field. The figure 5 gives architecture of the Data and Remote frames. Fig. 5: Architecture of Data Frame of CAN Bus[6] Following are the fields in data and remote frame: 1) Message Identifier (11/29 bits) - This field contains a message ID for each frame which is either 11 (standard ID) or 29 (Extended ID) bits. 
No two message frames in the CAN network should have the same message ID. 2) Data Field - This field contains the actual data which can be of maximum 8 bytes. The CAN protocol uses mailboxes for transmission and reception of data. A transmission mailbox carries the data to be transmitted onto the bus and the reception mailbox stores the received data. The CAN protocol allows us to assign priority based on ID numbers. Accordingly, lower ID numbers are given higher priority during transmission. Each message type is assigned an ID and each mailbox is set up to receive specific message ID. The code snippet in the figure 6 shows that BRAKE is given the highest priority as compared to THROTTLE, BRAIN and STEER.
  • 3. Fig. 6: Code snippet : CAN Bus priority C. Quadrature Rotary Encoder (QRE) Quadrature encoders are regularly used in robotics to detect and measure movement of the vehicle’s drive wheels. Quad encoders are a refinement over single-channel encoders. While they are more difficult to engineer – quadrature encoders demand more precision in construction – they provide greater resolution and they are capable of detecting changes in wheel or motor direction. The name quadrature comes from the four possible output states of the encoder sensor. Rather than just a single pulse indicating a slotted or striped track in a code wheel, quad encoders use a pair of tracks to go through four distinct phases. [7] The track on the inside is set apart from the outside track by 90 degrees as shown in figure 7. Fig. 7: Proposed QRE layout for Rear ATV Wheels This alternating sequence provides the four phases needed for proper quadrature. Since there are two tracks so the encoder require two sensors, one for each track. The two sensors pick up the pattern changes in their respective tracks as shown in figure 8. The order in which the signal change from LOW to HIGH in each channel indicates whether the encoder is rotating clockwise or counter-clockwise. By detecting the order of change one can tell whether the wheels of the robot are spinning forward or backward. D. DaNI 2.0 Robotic Platform DaNI 2.0 is an out-of-the-box mobile robot platform with sensors, motors and an NI 9632 Single-Board Reconfigurable Fig. 8: Waveform output of the two sensors[7] Fig. 9: DaNI 2.0 Robot[8] I/O computer mounted on top of a Pitsco TETRIX erector robot base as shown in figure9.[8] The NI Single-Board RIO shown in figure 10, is an embed- ded deployment platform that integrates a real-time processor, reconfigurable field-programmable gate array (FPGA), and analog and digital I/O on a single board. This board is programmable with LabVIEW Real-Time, LabVIEW FPGA and LabVIEW Robotics software modules. The real-time processor runs the LabVIEW Real-Time Mod- ule on the Wind River VxWorks real-time operating system (RTOS) for extreme reliability and determinism. LabVIEW contains built-in drivers and APIs for handling DMA or inter- rupt request (IRQ) - based data transfer between the FPGA and real-time processor. The 2.0 starter kit includes ultrasonic and optical encoder sensors, but the reconfigurable I/O capability allows us to expand the kit to experiment with a variety of sensors including: LIDAR, Radar, Infrared, Compass, etc.[8] E. Steering Theory 1) Differential Drive: DaNI 2.0 uses a drive mechanism known as Differential drive. The robot has two drive wheels mounted on a common axis as shown in figure 11, and each wheel can independently be driven either forward or backward.
Fig. 10: NI Single-Board RIO 9632[8]

While the velocity of each wheel can vary, for the robot to perform rolling motion it must rotate about a point that lies along its common left and right wheel axis. The point that the robot rotates about is called the Instantaneous Centre of Curvature (ICC).

Fig. 11: Differential Drive Kinematics[9]

By varying the velocities of the two wheels, the trajectory that the robot follows can be varied. Because the rate of rotation ω about the ICC must be the same for both wheels, the following equations can be written:

ω(R + l/2) = Vr
ω(R − l/2) = Vl

where l is the distance between the centers of the two wheels, Vr and Vl are the right and left wheel velocities along the ground, and R is the distance from the ICC to the midpoint between the wheels.[9] Solving this pair of equations gives

R = (l/2)(Vr + Vl)/(Vr − Vl),    ω = (Vr − Vl)/l.

a) Forward Kinematics for Differential Drive Robots: In figure 11, assume the robot is at some position (x, y), headed in a direction making an angle θ with the X-axis. The robot is assumed to be centered at a point midway along the wheel axle. By manipulating the control parameters Vl and Vr (the wheel velocities along the ground), the robot can be made to move to different positions and orientations. Knowing the velocities Vl and Vr, the ICC location is

ICC = [x − R sin(θ), y + R cos(θ)]

and at time t + δt the robot's pose (x', y', θ') is obtained by rotating the robot about the ICC by the angle ωδt:

x' = cos(ωδt)(x − ICCx) − sin(ωδt)(y − ICCy) + ICCx
y' = sin(ωδt)(x − ICCx) + cos(ωδt)(y − ICCy) + ICCy
θ' = θ + ωδt

Fig. 12: Position of robot

Fig. 13: Forward Kinematics for Differential Drive[9]

From figure 13, the motion of the robot is equivalent to 1) translating the ICC to the origin of the coordinate system, 2) rotating about the origin by an angular amount ωδt, and 3) translating back to the ICC.

2) Ackermann Steering: The ATV uses an Ackermann steering drive. The steering angle, in terms of traditional four-wheel vehicle dynamics, refers neither to the angle of the right or left wheel nor to the angle of the steering wheel/handlebar column relative to the centerline of the vehicle. Instead, it is the Ackermann steering angle, which is the average of the two steered wheels' angles to the vehicle centerline. This is because the two steered wheels point at slightly different angles, to prevent the tyres from slipping when turning in a circle, as shown in figure 14:

Ackermann angle: δ = (δo + δi)/2
Turning radius: R = L/δ

Where:
δo = the angle of the outside steered wheel
δi = the angle of the inside steered wheel
L = the axle-to-axle wheelbase
t = the track width (center of tire to center of tire)

Fig. 14: Ackermann Steering Profile[10]
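For completeness, the individual wheel angles can be related to the geometry above by the standard Ackermann relations; this is a textbook reconstruction using the symbols just defined (with R measured from the turn centre to the midpoint of the rear axle), not an equation reproduced from figure 14:

\[
\tan(\delta_i) = \frac{L}{R - t/2}, \qquad \tan(\delta_o) = \frac{L}{R + t/2}
\]

For large turning radii these reduce to δi ≈ L/(R − t/2) and δo ≈ L/(R + t/2), and their average recovers the small-angle approximation δ ≈ L/R quoted above.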
III. METHODOLOGY

A. Current ATV Modules and Components

As a result of previous R&D, the ATV is currently controlled via an RC remote and operates by integrating various CAN-enabled modules. Each module is built from hardware components that can easily send and decipher CAN messages. The following section explains the roles of the various modules and the hardware components used in them.

1) Remote Control: A wireless remote control system is used for this project. The Spektrum DX6 transmitter and BR6000 receiver, shown in Figures 15 and 16, are used to control the ATV. Because the remote control is a standard hobby transmitter-receiver pair, its output channels carry the same pulse-width signals used to command servos.

Fig. 15: The Spektrum DX6 Remote[11]

Fig. 16: The Spektrum BR6000 Receiver[11]

To interpret these signals, pulse-measurement techniques were applied to four output channels of the receiver. The receiver was powered by the same 5 volt regulator that powered the Renesas RX63N board.

2) Renesas RX63N microcontroller: The Renesas RX63N development board, shown in figure 17, provides a useful platform for evaluating the Renesas suite of development tools for coding and debugging, using the High-performance Embedded Workshop (HEW) IDE, and for programming the device using the on-board SEGGER J-Link JTAG debugger. It operates at up to 100 MHz and offers a broad memory lineup, with 256 KB to 2 MB of built-in flash (plus ROMless variants) and 64 KB to 256 KB of built-in RAM.[13]

Fig. 17: Renesas RX63N microcontroller[12]

The RX63N is used mainly for its CAN module and accompanying CAN API. It accepts command values from either the ROS node or the remote control, translates them into CAN H and CAN L signaling, and sends them on the bus as Data frames.
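The translation step just described can be sketched as follows. The pulse range, scaling and frame layout are illustrative assumptions (typical of hobby receivers), not the exact values used in the ATV firmware; the CanDataFrame structure is the illustrative one from Section II-B.

#include <cstdint>
#include <algorithm>
#include <cstdio>

// Minimal stand-in for a standard CAN data frame (same shape as the sketch in Section II-B).
struct CanDataFrame {
    std::uint16_t id;
    std::uint8_t  dlc;
    std::uint8_t  data[8];
};

// Map a hobby-RC pulse width (assumed 1000-2000 microseconds) to an 8-bit command value.
std::uint8_t pulseToCommand(std::uint32_t pulse_us)
{
    std::uint32_t clamped = std::min<std::uint32_t>(std::max<std::uint32_t>(pulse_us, 1000u), 2000u);
    return static_cast<std::uint8_t>((clamped - 1000u) * 255u / 1000u);
}

int main()
{
    CanDataFrame throttle{0x030, 1, {0}};          // identifier value is a placeholder
    throttle.data[0] = pulseToCommand(1500);       // mid-stick pulse -> roughly 50% throttle
    std::printf("throttle command byte = %u\n", throttle.data[0]);
    return 0;
}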
3) Brake and Throttle Module: The Brake and Throttle module, shown in figure 18, is responsible for operating the actuator used in the braking mechanism and the servo motor that drives the throttle for locomotion of the ATV.

Fig. 18: Brake and Throttle Module

a) Linear Actuator: The Duff-Norton family of linear actuators, one of which is shown in figure 19, features load capacities ranging from 27 pounds to 25 tons, gear- or belt-driven designs, and acme-screw and ball-screw systems with various AC and DC input voltages. They are designed for a variety of commercial and industrial applications, including gates, dampers, oven and processing tank doors, antennas, ergonomic furniture, and agricultural equipment.[15] The ATV uses the actuator to apply the brake with a varying stroke length.

Fig. 19: Duff-Norton Linear Actuator[14]

b) Throttle Servo: As with many vehicles, the throttle on the Honda ATV has a spring return. When the throttle is not engaged, it returns to an idle position. When the throttle is in the idle position, the vehicle does not begin to move or comes to a slow stop. This is advantageous because the brake does not need to be applied in most operating conditions, and even downhill slopes may only require braking to come to a complete stop.[11] A standard Parallax servo, shown in Figure 20, was selected to control the throttle.

Fig. 20: Parallax Standard Servo[11]

Fig. 21: Servo attached to the throttle assembly[11]

Figure 21 shows the servo attached to the throttle assembly. Note that the throttle can be manually controlled when the servo is not powered, which allows a rider to turn off the control circuits and drive the ATV manually.

c) CAN Transceiver: The SN65HVD251 CAN transceiver, shown in figure 22, is intended for use in applications employing the CAN serial communication physical layer in accordance with the ISO 11898 standard. It provides differential transmit capability to the bus and differential receive capability to a CAN controller at speeds of up to 1 Mbps.

Fig. 22: SN65HVD251 CAN Transceiver[16]

Designed for operation in harsh environments, the device features cross-wire, over-voltage and loss-of-ground protection up to 36 V. The common-mode range is -7 V to 12 V, with a tolerance to transients of ±200 V.[17] The transceiver interfaces the single-ended CAN controller at the Brake and Throttle node with the differential CAN bus, connecting directly to the CAN bus module in the Renesas RX63N.

d) Logic Level Converter: The SparkFun bi-directional logic level converter, shown in figure 23, is a small device that safely steps 5 V signals down to 3.3 V and steps 3.3 V signals up to 5 V at the same time.[18] The high and low voltages can be set, and signals step up and down safely between them on the same channel.

Fig. 23: SparkFun Logic Level Converter[18]

The board must be powered from both voltage sources (high voltage and low voltage). The logic level shifter is used to compensate for the difference in operating voltages of the servo and the Sakura board, which are 5 V and 3.3 V respectively.

e) Renesas Sakura board: The SAKURA board (Arduino compatible) is based on the Renesas RX63N series 32-bit microcontroller. It has on-chip flash memory and enhanced communication functions, including an Ethernet controller and USB 2.0 Host/Function.

Fig. 24: SAKURA Board[19]

The Renesas SAKURA board, shown in figure 24, can be programmed with the cloud-based compiler supported by Renesas.[20] Detailed technical specifications of the Sakura are as follows:
• Microcontroller: RX63N (R5F563NBDDFP)
• Operating Voltage: 3.3V
• Clock Speed: 96MHz
• Digital I/O Pins: 55
• Analog Input Pins: 16
• Flash Memory: 1MB
• RAM: 128KB
• USB Function: (mini-B)

The Sakura board is used because it provides a CAN API for transmitting CAN bus messages as well as compatibility with Arduino.
It receives messages from the servo side through the level converter and transmits them to the motor controller connected to the braking actuator.

4) Steering Module: The ATV is equipped with an electronic power steering (EPS) module, shown in Figures 25 and 26, which comprises the torque sensor (figure 28), the power steering motor (figure 27) and the steering angle encoder (figure 29).

Fig. 25: Steering Module

Fig. 26: Electronic Power Steering[11]

a) Power steering motor: The EPS provides power to the motor in response to a torque sensor attached to the steering column. As the rider turns the handlebars, torque is applied to the steering shaft, which is translated into a need for steering assistance.

Fig. 27: Power Steering Motor[2]

The DC resistance of the power steering motor shown in figure 27 is approximately 0.33 ohms. The EPS can be fooled into thinking that the torque sensor is transmitting a signal; the EPS then provides power to the steering motor, thereby eliminating the need to develop a separate control system for the motor.

b) Torque Sensor: At the heart of the electric power steering system's operation is the torque sensor shown in figure 28. A deformable element is installed between the operator's handlebar and the steered wheels' linkage. A resistive sensor array is placed at the interface of the two sections, and when a differential angular displacement is detected across the junction, a directionally dependent change in resistance is given as the sensor output. The sensor comprises two completely independent resistive elements, one for each direction (right/left). A third wire, shown in figure 28, is connected as a ground reference.

Fig. 28: Torque Sensor[21]

c) Steering Angle Encoder: From the prior work on the steering, a US Digital MAE3 magnetic encoder, shown in figure 29, was affixed to the steering column to provide feedback for closed-loop control of the steering angle. The encoder employs absolute position encoding and a 10-bit D/A converter, allowing it to function essentially as an analog potentiometer. In its connected configuration, the potentiometer has a 5 V full-scale range. Because the limited lock-to-lock steering angle uses only part of this range, the encoder is calibrated to output under a 5 V supply.

Fig. 29: Steering Angle Encoder[21]

The encoder, as configured for analog output, has a 10-bit resolution. This yields a functional resolution of approximately 3.56 mV/bit. The output is then imported into the microcontroller via a 10-bit A/D converter, preserving this resolution if interference is not considered.

B. Proposed Components and Methods

1) National Instruments (NI) myRIO: The NI myRIO, shown in figure 30, places dual-core ARM Cortex-A9 real-time processing and Xilinx FPGA customizable I/O into a single module. RIO devices feature a processor and an FPGA, both of which are fully programmable using NI LabVIEW software. The NI myRIO is used because it is programmable with the LabVIEW Robotics toolkit. This toolkit provides a steering frame that can be set to three different modes: user-defined, differential and Ackermann. Figure 31 shows the Ackermann steering frame. The frame needs four input values, namely wheel radius, gear ratio, wheel separation width and wheel separation length. From these parameters the program calculates the kinematics needed to send the proper signals to the motor controllers and other steering- and drive-related devices.
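The kind of calculation performed from those four parameters can be illustrated with a simplified bicycle-model sketch: converting a desired forward velocity and yaw rate (the quantities later carried by the ROS Twist message) into a central steering angle and a motor shaft speed. The structure, function names and numeric values below are illustrative assumptions, not the LabVIEW implementation.

#include <cmath>
#include <cstdio>

struct AckermannFrame {
    double wheel_radius_m;   // drive wheel radius
    double gear_ratio;       // motor revolutions per wheel revolution
    double wheelbase_m;      // wheel separation length (front axle to rear axle)
    double track_m;          // wheel separation width (not used by the simplified model below)
};

// Convert a desired forward velocity v (m/s) and yaw rate w (rad/s) into a
// central steering angle (rad) and a motor angular velocity (rad/s).
void twistToAckermann(const AckermannFrame& f, double v, double w,
                      double& steer_rad, double& motor_rad_s)
{
    steer_rad   = (std::fabs(v) > 1e-6) ? std::atan(f.wheelbase_m * w / v) : 0.0;
    motor_rad_s = (v / f.wheel_radius_m) * f.gear_ratio;
}

int main()
{
    AckermannFrame atv{0.28, 1.0, 1.25, 0.95};   // placeholder dimensions, not measured ATV values
    double steer = 0.0, motor = 0.0;
    twistToAckermann(atv, 1.0, 0.2, steer, motor);
    std::printf("steering angle = %.3f rad, motor speed = %.2f rad/s\n", steer, motor);
    return 0;
}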
Fig. 30: NI myRIO[22]

Fig. 31: Ackermann Steering Frame Wizard in LabVIEW Robotics

2) ROS Master: The ROS Master node runs on a Linux operating system (Ubuntu). The version of ROS used for this research is ROS Indigo. The ROS Master provides naming and registration services to the rest of the nodes in the ROS system, and it tracks publishers and subscribers to topics as well as services. Here, the role of the Master is to enable the ROS node, DaNI 2.0, to communicate with the Master node. To start the Master, the 'roscore' command is entered in a terminal. The 'roscore' command does not take any arguments, and the Master should continue running for the entire time ROS is in use; it can be stopped by typing Ctrl+C in its terminal. The ROS Master generates a TCP port number, as shown in figure 32, which is used by the ROS node to communicate with the Master. Next, the 'teleop_twist_keyboard' node shown in figure 33, a generic keyboard teleoperation node for Twist-controlled robots, is run. It accepts real-time input from the keyboard and publishes it as Twist messages.

3) ROS Node (DaNI 2.0): A small-scale prototype of the ATV, DaNI 2.0, was used to test the functioning of the system. DaNI 2.0 is programmable with the LabVIEW Real-Time, LabVIEW FPGA and LabVIEW Robotics software modules, which were used to publish and subscribe to Odometry and Twist command information.

Fig. 32: ROS Master running in a Linux environment (Ubuntu)

Fig. 33: ROS Default Teleop Window used to Control the Subscribing Robot

The ROS plugin for LabVIEW, shown in figure 34, was developed by Tufts University. This plugin provides a list of all the ROS topic frames and functions for sending and receiving data between the subscribing robot and the publishing computer. The Teleop message packet shown in figure 35 is used to remotely operate the subscribing robot; the basic commands sent from the keyboard are used to move or actuate the robot's frame and motors. The ROS for LabVIEW toolkit lets the user simply define what type of message will be published or subscribed to. In the example shown in Figure 36, the ROS message publishes Twist commands that are sent to the robot, enabling it to move. The Twist command is part of the command velocity (cmd_vel) parent topic. The Odometry message packet in figure 37 is used to send position and pose information to the subscribing laptop. The Odometry message enables ROS to track where the robot is in space.
Fig. 34: ROS for LabVIEW Toolkit

Fig. 35: Front panel that contains the structure of a Twist message, a node parameter, and error detection

The Odometry message is the second vital key in enabling a robotic platform to become a standard ROS base robot. The QRE on the ATV was essential to provide the correct feedback information to the ROS Master node so that the present and future trajectories of the robot can be calculated. The ROS for LabVIEW toolkit also enables the user to publish Odometry information back to the master node laptop. In the example shown in Figure 38, the laptop subscribes to the Odometry topic and receives the Odometry information from the robot. This message is then parsed into its respective format for viewing by the user. The Odometry message is a navigation message that is part of the odom parent topic.

Every item in the ROS Toolkit for LabVIEW depends on the toolkit finding and establishing communication with the ROS Master node. In the screenshot in Figure 39, a prompt queries the user for the IP address of the Master node. It is also vital that the laptop and the wireless router on the robot be on the same local network.

Fig. 36: Block diagram of publishing Twist commands to control the robot

Fig. 37: Front panel that contains the structure of an Odometry message, a node parameter and error detection

Because of the abstraction level of ROS, the end user is left to determine how to interpret and distribute the Twist commands to make the robot move, stop, turn, etc. The robot, in turn, must be able to publish the Odometry data structure back to the ROS Master node. For the ATV, to best simulate an actual automobile, the Twist commands and the accompanying Odometry messages are distributed using the CAN bus protocol. Each node takes the command received from the CAN bus and drives its respective components. This aids in simulating a setup similar to that of a full-size vehicle.

4) Quadrature Rotary Encoders: A pair of quadrature rotary encoders was added on the rear wheels of the ATV for the odometry measurements that ROS requires to determine the spatial coordinates of the ATV. Each encoder comprises a set of three infrared proximity sensors (Sharp GP2D120); one set is mounted on each of the rear wheels, as shown in figures 40 and 41. The encoder disc has a minimum resolution of 11.25 degrees.

a) Sharp GP2D120 IR Sensor: The GP2D120, shown in figure 42, is a distance-measuring sensor with integrated signal processing and an analog voltage output. Its technical specifications are as follows:
1) Effective range: 4 to 30 cm
2) Output: Analog
3) Typical response time: 39 ms
4) Typical start-up delay: 44 ms
5) Average current consumption: 33 mA

It is composed of an integrated combination of a PSD (position sensitive detector), an IR LED and a signal processing circuit.
Fig. 38: Block diagram of a ROS subscription to Odometry messages to receive feedback from the robot

Fig. 39: Establishing the ROS Master IP in the LabVIEW main front panel

The varying reflectivity of the object, the environmental temperature and the operating duration do not easily influence the distance detection, because the triangulation method is adopted. The device outputs a voltage corresponding to the detected distance, so it can also be used as a proximity sensor.[23] The sensors are used to calculate the velocity and determine the direction of rotation of the wheel.

The outermost sensor in figure 43 is used to count pulses and determine the speed of the vehicle. The speed is calculated with the formula

S = 2πr/t

where r is the radius of the wheel and t is the time taken to complete one revolution. The code in figure 44 shows that one revolution consists of 8 pulses. This is because the sensors used for the encoders require the width of each stripe to be more than twice the width of the sensor itself. With 16 partitions on the encoder disc (8 black and 8 white), the resolution is 22.5 degrees. However, since a quadrature encoder is used, the middle ring is set 90 degrees ahead of the outer ring in phase; by correlating the values from the outer and middle sensors, the resolution can be reduced to 11.25 degrees.

Fig. 40: QRE setup on the ATV with marker for rotation (left side)

Fig. 41: QRE setup on the ATV with marker for rotation (right side)

The outermost and middle sensors are used together to determine the direction of the wheel. When the code shown in figure 45 finds a low-to-high transition on the outermost sensor, it checks whether the middle sensor is low or high and reports the direction of rotation as clockwise or counter-clockwise, respectively. The innermost sensor has two functions: a) to mark the starting point from which the outermost sensor counts pulses, and b) to determine the angle of rotation if the distance traversed by the wheel is less than 55 degrees. Four pulses were used for the innermost sensor to reduce the margin of error in determining the starting point: with a single pulse the error would be up to 349 degrees, but with four pulses it decreases to 55 degrees. Figure 46 shows a code snippet for the angle measurement. All possible combinations of starting and ending points within the 55-degree sector between two consecutive innermost black stripes were converted to corresponding decimal values. For example, an initial value of 0 and a final value of 2 correspond to an angular rotation of 33 degrees.
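The pulse-counting and direction logic described above (the actual implementations appear in figures 44 and 45) can be summarised in the following sketch. The variable names, sampling scheme and wheel radius are illustrative assumptions rather than the code running on the ATV.

const int    PULSES_PER_REV = 8;       // 8 black / 8 white stripes on the outer ring
const double WHEEL_RADIUS_M = 0.28;    // placeholder wheel radius
const double PI             = 3.14159265358979;

bool   lastOuter   = false;            // previous level of the outermost sensor
int    pulseCount  = 0;                // pulses counted in the current revolution
double lastRevTime = 0.0;              // time (s) at which the previous revolution completed
double speed_mps   = 0.0;              // most recent speed estimate, S = 2*pi*r / t
bool   clockwise   = true;             // most recent direction estimate

// Call on every sample of the two sensor channels; 'now' is the current time in seconds.
void updateQre(bool outer, bool middle, double now)
{
    if (outer && !lastOuter)                       // low-to-high transition on the outer track
    {
        // As in figure 45: the middle track leads the outer track by 90 degrees,
        // so its level at this instant gives the direction of rotation
        // (low = clockwise, high = counter-clockwise).
        clockwise = !middle;

        if (++pulseCount == PULSES_PER_REV)        // one full wheel revolution completed
        {
            double t  = now - lastRevTime;         // time for one revolution
            speed_mps = (2.0 * PI * WHEEL_RADIUS_M) / t;
            lastRevTime = now;
            pulseCount  = 0;
        }
    }
    lastOuter = outer;
}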
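The wheel speed and direction recovered by the QRE are exactly the feedback needed to populate the Odometry message that, as noted earlier, the robot must publish back to the ROS Master. A minimal roscpp sketch of such a publisher follows; the frame names, update rate and the assumption of straight-line motion are illustrative simplifications, not the LabVIEW implementation used on the ATV.

#include <ros/ros.h>
#include <nav_msgs/Odometry.h>
#include <cmath>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "odom_publisher");
  ros::NodeHandle nh;
  ros::Publisher odom_pub = nh.advertise<nav_msgs::Odometry>("odom", 50);

  double x = 0.0, yaw = 0.0;            // integrated pose (straight-line motion assumed)
  ros::Time last = ros::Time::now();
  ros::Rate rate(20);                   // 20 Hz update loop

  while (ros::ok())
  {
    ros::Time now = ros::Time::now();
    double dt = (now - last).toSec();
    last = now;

    double v = 0.5;                     // the wheel speed from the QRE would be read here (m/s)
    x += v * dt;                        // integrate position along the current heading

    nav_msgs::Odometry odom;
    odom.header.stamp    = now;
    odom.header.frame_id = "odom";      // illustrative frame names
    odom.child_frame_id  = "base_link";
    odom.pose.pose.position.x    = x;
    odom.pose.pose.orientation.z = std::sin(yaw / 2.0);   // yaw expressed as a quaternion
    odom.pose.pose.orientation.w = std::cos(yaw / 2.0);
    odom.twist.twist.linear.x    = v;

    odom_pub.publish(odom);
    rate.sleep();
  }
  return 0;
}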
Fig. 42: Sharp GP2D120 IR Sensor[23]

Fig. 43: Three IR sensors mounted on the ATV

5) Brake Actuator: Previous teams had programmed the brake actuator, shown in figure 47, to operate at two settings: full release and full brake. We added more resolution to the speed and stroke length of the actuator, since ROS requires variable braking to perform mapping and other maneuvers. The brake can now be applied at different rates for different road conditions. Table I shows the stroke lengths for the different braking levels; the first column gives the time required to apply the brake for the corresponding condition.

TABLE I: Braking resolution achieved with the modified actuator control
Time (s) | Stroke length (cm) | Brake level
1.0      | 4.1                | Full release
1.5      | 5.1                | Quarter brake
2.0      | 6.1                | Half brake
2.2      | 6.4                | Three-quarter brake
2.5      | 7.2                | Full brake

Fig. 44: Code snippet - calculation of the speed of the ATV

Fig. 45: Code snippet - direction of rotation of the wheels

IV. RESULTS

A. Quadrature Rotary Encoder

The graph in Figure 48 depicts a real-time test of the QRE-based system implemented on the All-Terrain Vehicle. The test was performed over a two-minute window: the ATV idled for approximately the first minute, while during the last minute a gradual throttle input was applied to produce a consistent acceleration. The QRE showed a speed of approximately two miles per hour (mph) at the idle state. As shown in the graph in Figure 48, when the ATV was accelerated to its maximum speed for first gear, the QRE reading showed a linear relationship with respect to the speed of the ATV. The ATV was then returned to an idle state. This concluded the QRE test and the data recording; the test setup is shown in figure 49.
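For illustration, the calibrated settings in Table I could be exposed to higher-level control code through a simple lookup such as the sketch below; the structure and function names are hypothetical, and only the numeric values are taken from Table I.

#include <cstddef>

struct BrakeSetting {
    double time_s;       // actuation time from Table I
    double stroke_cm;    // actuator stroke length from Table I
};

// Index 0 = full release ... index 4 = full brake (values from Table I).
const BrakeSetting BRAKE_TABLE[5] = {
    {1.0, 4.1},          // full release
    {1.5, 5.1},          // quarter brake
    {2.0, 6.1},          // half brake
    {2.2, 6.4},          // three-quarter brake
    {2.5, 7.2}           // full brake
};

// Map a normalised brake command (0.0 = release, 1.0 = full brake)
// to the nearest of the five calibrated settings.
BrakeSetting selectBrakeSetting(double command)
{
    if (command < 0.0) command = 0.0;
    if (command > 1.0) command = 1.0;
    std::size_t idx = static_cast<std::size_t>(command * 4.0 + 0.5);
    return BRAKE_TABLE[idx];
}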
Fig. 46: Code snippet - measurement of the angle of rotation before the starting point

Fig. 47: Setup of the actuator on the ATV

V. CONCLUSION

The purpose of this research was to allow an existing RC-controlled All-Terrain Vehicle to be ROS enabled. Previous work had enabled the robot to be successfully controlled via an RC controller and receiver. With the base robot working, there was a need to add more sensors to the robot. The ROS framework was chosen for its open-source format and community development. ROS gives the user the ability to deploy elaborate localization and mapping techniques with little effort. Since ROS operates at a high level, the user is left with the task of ensuring that the commands sent by the ROS Master are properly translated to the robot's motors, actuators, etc. Likewise, the user must package the feedback from the wheels and other sensors in the exact data structure of the Odometry message. The novel QRE that was developed for the ATV enables it to send this information back to the ROS Master node. The additional resolution provided by the new braking algorithm gives a higher fidelity of movement to the braking actuator and thus results in finer-grained control of the ATV. With the development of the QRE and the added brake resolution, the ATV is now ready to be tested with any localization and mapping technique the end user desires. Lastly, with the ROS framework, the user can add a multitude of off-the-shelf (OTS) sensors and components to the base ATV to further its development.

Fig. 48: Graph of QRE test results on the ATV

Fig. 49: QRE test setup (the rear wheels of the ATV were elevated as shown)

VI. FUTURE WORK

With the base ROS framework now deployable on the ATV, future teams can begin to add more sensors to the robot. In particular, the SICK LMS200 LIDAR will be added so that the robot's surroundings can be viewed in the native ROS RVIZ environment. The image in Figure 50 shows the LMS200 LIDAR sensor currently mounted on the ATV, and Figure 51 shows a screenshot of the RVIZ software being used to map an existing building. The ROS framework enables sensors such as the LMS200 LIDAR to be deployed on the ATV quickly: the user simply downloads the ROS driver for the sensor and it is ported into the robot's underlying architecture. With the simple design of the QRE and the resolution provided by the braking algorithm, researchers can readily port their particular robotic platform into a ROS enabled platform. Together, the QRE and the braking resolution algorithm provide the framework needed, in principle, to make any current robotic platform ROS enabled.

REFERENCES

[1] M. Quigley, K. Conley, B. Gerkey, J. Faust, T. Foote, J. Leibs, R. Wheeler, and A. Y. Ng, "ROS: an open-source robot operating system," in ICRA Workshop on Open Source Software, vol. 3, no. 3.2, 2009, p. 5.
Fig. 50: Currently mounted LMS200 LIDAR sensor

Fig. 51: Screenshot of ROS's native RVIZ environment[24]

[2] A. Cortner, J. M. Conrad, and N. A. BouSaba, "Autonomous all-terrain vehicle steering," in Southeastcon, 2012 Proceedings of IEEE. IEEE, 2012, pp. 1-5.
[3] 2016. [Online]. Available: http://www.ros.org/about-ros
[4] 2016. [Online]. Available: http://wiki.ros.org/ROS/Introduction
[5] 2016. [Online]. Available: https://en.wikipedia.org/wiki/CAN_bus
[6] S. K. R. Gurram, "Implementation of controller area network (CAN) bus in an autonomous all-terrain vehicle," Ph.D. dissertation, The University of North Carolina at Charlotte, 2011.
[7] 2016. [Online]. Available: http://www.robotoid.com/appnotes/circuits-quad-encoding.html
[8] 2016. [Online]. Available: https://www.pitsco.com/About/?art=5000
[9] E. Lee, "The problem with threads," Computer, vol. 39, no. 5, pp. 33-42, 2006.
[10] 2016. [Online]. Available: http://www.diracdelta.co.uk/science/source/a/c/ackermann%20steering/source.html#.Vx1OlzArLIU
[11] R. A. McKinney, M. J. Zapata, J. M. Conrad, T. W. Meiswinkel, and S. Ahuja, "Components of an autonomous all-terrain vehicle," in IEEE SoutheastCon 2010 (SoutheastCon), Proceedings of the. IEEE, 2010, pp. 416-419.
[12] 2016. [Online]. Available: http://www.renesas.com/products/tools/introductory_evaluation_tools/renesas_demo_kits/yrdkrx63n/index.jsp
[13] 2016. [Online]. Available: http://www.am.renesas.com/products/tools/introductory_evaluation_tools/renesas_demo_kits/yrdkrx63n/index.jsp
[14] 2016. [Online]. Available: http://www.minutemancontrols.com/duff-norton_main.html
[15] 2016. [Online]. Available: http://www.duffnorton.com/products.aspx?id=7841
[16] 2016. [Online]. Available: http://www.ti.com/product/SN65HVD230
[17] 2016. [Online]. Available: http://www.ti.com/product/SN65HVD251
[18] SparkFun Electronics, "SparkFun Logic Level Converter - Bi-Directional - BOB-12009," 2016. [Online]. Available: https://www.sparkfun.com/products/12009
[19] 2016. [Online]. Available: http://sakuraboard.net/gr-sakura_en.html
[20] 2016. [Online]. Available: http://www.rs-online.com/designspark/electronics/nodes/view/type:design-centre/slug:GR-Sakura
[21] J. R. Henderson, J. M. Conrad, and C. Pavlich, "Using a CAN bus for control of an all-terrain vehicle," in SOUTHEASTCON 2014, IEEE. IEEE, 2014, pp. 1-5.
[22] 2016. [Online]. Available: http://www.ni.com/myrio/what-is/
[23] 2016. [Online]. Available: https://www.pololu.com/product/1136
[24] 2016. [Online]. Available: http://www.iroboapp.org/images/f/f8/Rviz2.png