The Future of Hauling - Autonomous Cargo Transporter
Team Members: Guanqun Huang, Wei Hu, Muireann Spain,
Nathan Srinivasan, Mingguang Zhou
Advisors: Professor Francesco Borrelli and Xu Shen (PhD Student)
27th April 2020
List of Figures
1 Porter’s Five Forces Analysis Method
2 Robots in Amazon’s autonomous warehouse [18]
3 Organizational Map of the Autonomous Stack and responsible parties
4 Vestil EMHC-2860-1 heavy-load cart [4]
5 Disassembly of cart for motor controller access
6 Wires and components under the knob cover
7 Hand sketch of the circuitry of the cart
8 Left to right kinematic diagram of steering linkage system made with Solidworks
9 Mounted Actuator Steering System
10 Finished and mounted sensor rack
11 Controller setup for manual control
12 Manually driving the cart in the parking lot
13 The experimental linear relationship between wheel speed and voltage input (“index” here is the digital representation of the input voltage, for which index 0 equals 0 V and index 128 equals 5 V)
14 The block diagram of the feed-forward PI controller for the propulsion control
15 The simulated cart velocity with the PI controller
16 Mapping Algorithm Pipeline
17 Mapping Data Collected From the Lower Hearst Parking Garage
18 Localization Algorithm Pipeline
19 Localization Result
20 Simple Ellipse Route Map (Red Points) shown on 2-D Point Cloud Static Obstacle Map (Blue)
21 Simple Parking Maneuver Route Map (Red Points) shown on 2-D Point Cloud Static Obstacle Map (Blue)
22 Route Map from ROS Bag (Red Points) shown on 2-D Point Cloud Static Obstacle Map (Blue)
23 Nonlinear Bicycle Model
24 Comparison of the actual vehicle state versus the reference value
25 2-D Simulation of cart model following predefined waypoints (black) in point cloud map (grey)
26 MEng Student Team with Finished Hardware Platform
27 Curtis 1212P Circuit Diagram
28 Equivalent Stress with minimal stress points along chamfered corners of component
Executive Summary
When humans complete repetitive tasks that involve moving heavy loads in a dynamic and complex environment, they can fail to perceive critical aspects of their immediate surroundings. This can become dangerous for other human operators in the area. An autonomous cargo transporter would help reduce the accidents that occur when the ability to carry out repetitive tasks degrades. The chief benefits of developing such a platform are improved efficiency and safety. The overall aim of this project was to deploy an autonomous electric cart capable of traveling at 0.5 m/s while transporting a load of 1500 lbs from an initial starting point to a desired end point.
Within the specified project timeline of nine months, the team was able
to retrofit an electric heavy-load cart normally operated by a human with
sensors and mechanisms in order for it to maneuver and transport cargo
autonomously. This load cart will reduce potential human error when transporting cargo. This report analyzes the development of an autonomous mobility solution for cargo transport, focusing on the marketing aspects of the product as well as its technological development.
Section 1 elucidates the economic, technological and social aspects, including how various stakeholders and competitors are affected by the development of the product. Section 2 covers the technical features, including the software development, the hardware development, and the integration of the two into the final product.
Given a longer time frame, the project could be improved further. Improvements to the hardware system and additional testing of the overall system's robustness would be prioritized.
1 Business Outcome
1.1 Target Market
The first-stage target market is luggage-transporting vehicles in airports.
Every airport needs a set of ground support equipment systems to help them
with tasks such as refueling, cargo loading and passenger moving. Many
airports rely on human operators to complete these sorts of repetitive tasks.
However, relying on human drivers raises two distinct problems. Firstly, the labor cost is high, especially with the extra compensation airports must offer workers for an irregular schedule that typically runs around the clock. Secondly, human drivers are less productive under this kind of work cycle, which is why the work must be divided among multiple shifts.
In order to keep up with the demand of ensuring that thousands of flights depart successfully, airports must maintain a large human task force [5]. Human operators face exhaustion after working for several hours, which can lead to inefficiencies during each shift.
To lower the cost and improve the efficiency of the airport ground sup-
port, operation strategies need to be optimized by utilizing engineering tools
[1]. Automation of operations has huge potential to upgrade and main-
tain the ground support. An experiment conducted by Frank, Sebastian,
Schachtebeck and Hecker [6] has already demonstrated the feasibility of us-
ing automated towing vehicles to improve airport efficiency.
The team believes the development of an automated cart could further
progress airport ground support by specifically executing the luggage trans-
portation autonomously. The important features this cart would need in-
clude: an optimal path planner, real-time obstacle avoidance, heavy load
capacity and robustness. On the ground, this cart would be able to move
from an initial start point to a defined end point while following the optimal
path and carrying any necessary luggage or equipment. This resulted in the
defined cargo load capacity of 1500 lb. While navigating along the path, the
cart will have to constantly scan its environment to safely avoid or stop for
people, debris, and other vehicles.
In addition to airport ground support, autonomous heavy load vehicles
have many possible applications in different fields such as warehousing oper-
ations, retail operations, industrial manufacturing, hospitals and healthcare
service, hotels and resorts, food and beverage delivery, educational institu-
tions, railroad and so forth [8]. This extends the potential market for this sort
of platform, leading to further economic success when selling to customers.
1.2 Five-Force Industry Analysis Method
To justify the potential of the product, we analyzed its advantages and dis-
advantages regarding the industrial value chain. The five-force analysis, as
demonstrated in Figure 1, is a method that is widely used among investors
to understand the business models of certain companies when making invest-
ment decisions.
Figure 1: Porter’s Five Forces Analysis Method
The bargaining power of customers is relatively low. This cart could reduce the customer’s operating cost significantly. While the upfront cost of the cart would be around $15,000 after the research and development phase is completed, minimal maintenance is expected over the cart's operating lifetime to keep the system functional. Recruiting a human operator, by contrast, costs approximately $8,000 a month, so our autonomous transporter would be a cost-effective improvement for any airport. There are not many alternatives on the market, unless the customer requires a full-size autonomous vehicle [2].
The bargaining power of the supplier is high. This transporter faces a
similar problem to the autonomous industry as a whole - the high cost of
LiDAR. LiDAR is a surveying method that measures distance to a target by
illuminating the target with laser light and measuring the reflected light with
a sensor. In our system, LiDAR is used to build the three-dimensional point
cloud map and detect any obstacle around the cart. We believe, however,
that the cost is still tolerable, considering the target market. If the cost of
LiDAR is so high that it hinders sales, we could develop a radar- and camera-based computer vision system, such as the one Tesla uses in its Autopilot system [16].
The threat of new entrants is high because the autonomous transportation industry is a promising field, offering high investment interest, broad applications and a large potential to scale. The biggest barrier would be talent: these companies have to scramble to hire professionals while competing with large technology companies such as Google and Amazon.
The threat of substitutes is low. Our self-navigation cart is a better
solution for airport ground support operation than the current manual cart,
and currently there is no other possible solution in this scenario.
The threat of rivalry is low for this product, though as this is a relatively new and undefined market, the possibility of rivalry still needs to be considered. This threat may come from manually driven electric cart companies or large technology companies. Amazon has already developed its autonomous warehouse robots, but these robots cater mainly to the e-commerce industry; they are shown in Figure 2 below. The threat from existing electric vehicle companies is also low, as their products are too large for the indoor dynamic environment.
2 Technical Outcome
2.1 Technological Summary
When considering the marketing aspects of this project, the team needed to develop an autonomous platform that could fill the identified market gap. In order to develop a full-stack solution, the team referenced various autonomous vehicle companies, such as Kodiak Robotics, Waymo and Cruise. These companies did not develop their complete platforms from the ground up, but rather installed the features necessary to make an existing platform autonomous. The team therefore followed their method of retrofitting an existing vehicle hardware platform, so that it could focus on developing the autonomous stack within the scope of the project. The first few weeks of the Fall 2019 semester were spent researching various platforms; the details are given in the next subsection.
In order to incorporate the software stack with the hardware, the team
developed a flowchart shown below in Figure 3. The flowchart showcases the
communication between various high level software sections all the way down
to the motors on the physical platform. This report will discuss in further detail the technological development of both the hardware and software
subsystems used in this project. Because the team does not have access to an
airport, the Lower Hearst parking lot situated on the UC Berkeley Campus
was used for testing.
Figure 3: Organizational Map of the Autonomous Stack and responsible
parties
2.2 Hardware Implementation
2.2.1 Hardware Platform Acquisition
When searching for an adequate platform, the team found a variety of fully equipped platforms on the market with some level of autonomous capability. However, most did not have the desired load capacity of 1500 lbs. The team required a platform with room to be safely furnished with the desired sensors, computers and other necessary items without interfering with the loading capability. This led the team to the Vestil EMHC-2860-1, seen in Figure 4 below. It fit all of the team's criteria, including capacity, power, and the ability to attach and modify items on it. The intended use-case environments are defined in Section 1.
Figure 4: Vestil EMHC-2860-1 heavy-load cart [4]
2.2.2 Motion Execution: Propulsion Design
The propulsion control of the platform needed to be changed in order to de-
velop a system that would allow for autonomous control. This new propulsion
system would allow for the precise execution of speed from the high-level
controller. The team decided to use Arduino to take digital inputs from
high-level software and override the original manual control with new output
signals.
From observation, the longitudinal movement of the cart was controlled by a twisting knob at the center of the handlebar. The cart moved forward or backwards according to the direction in which the knob was rotated, and the amount of knob rotation was proportional to cart velocity in either direction. The team inferred that turning the knob changed the voltage into the motor controller, which in turn output up to 24 volts to the motor. Based on research, the motor controller on the cart was believed to be a Curtis 1228-244; the team was unable to see the physical motor controller until the cart was taken apart.
To further understand the propulsion system, the team disassembled the
chassis to gain access to the controller. This is seen below in Figure 5 below.
Figure 5: Disassembly of cart for motor controller access
From there, it was found that the motor controller on the cart was actually a Curtis 1212P, which differed from what was researched. Fortunately, the Curtis 1212P appeared to be similar to the Curtis 1228-244 found online; the Curtis 1212P manual is found in Appendix A as Figure 27. Referring to that diagram, the team determined how the knob component (Figure 6), which was very similar to the Curtis ET-190 MCU, was wired to the rest of the cart. Several rounds of tests were conducted, including voltage and signal measurements. Finally, the circuit diagram of the cart was confirmed, as shown in Figure 7.
Figure 7: Hand sketch of the circuitry of the cart
Figure 7 illustrates the “Pot Wiper” connection point, which takes in a voltage input between 0 and 5 V that is then converted to a 0-24 V output to the motor. This input also controls the locking and unlocking of the motor brake: the brake is locked if the input into the Pot Wiper is less than 1 V. It was also discovered that the Neutral line regulated the direction of the cart. When the 1212P controller receives 24 V on the Neutral line, the cart drives in our defined forward direction, which is towards the handlebar of the cart; when the input to the Neutral line is 0 V, it reverses.
In order to achieve digital control via the Arduino, the team utilized a
relay and digital potentiometer. The JZC-11F relay was chosen to connect
or disconnect the Neutral line with the 24 volt source line of the battery, so
that the Arduino could control the direction of the motor. The MCP Digital
Potentiometer was chosen to output 0-5 volts given a digital input; its output voltage would go into the Pot Wiper and thus control the cart speed.
On the algorithm side, the Arduino interface takes in a digital input of
x and y, and outputs the longitudinal movement to the cart. Here “x” is
an integer between 0 and 128 that linearly outputs voltage between 0 and
5 volts. The input “y” is either 0 or 1, “0” being that the relay is open so
the cart drives in reverse, and “1” being that the relay is closed so the cart
drives forward. The Arduino was also used for the steering system, and the
operation of the Arduino will be discussed later in the report.
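The digital interface described above can be sketched as follows. This is a minimal Python illustration of the mapping the Arduino performs (the actual firmware runs on the Arduino itself); the helper names and the brake-threshold check are assumptions for illustration.

```python
# Sketch of the digital propulsion interface described above. Helper names
# are hypothetical; the real logic runs as Arduino firmware.

V_MAX = 5.0      # Pot Wiper input range: 0-5 V
INDEX_MAX = 128  # digital potentiometer index range

def index_to_voltage(x: int) -> float:
    """Map a digital index x in [0, 128] linearly onto 0-5 V."""
    if not 0 <= x <= INDEX_MAX:
        raise ValueError("index out of range")
    return V_MAX * x / INDEX_MAX

def brake_locked(x: int) -> bool:
    """The motor brake stays locked while the Pot Wiper input is below 1 V."""
    return index_to_voltage(x) < 1.0

def drive_command(x: int, y: int):
    """Return (pot_voltage, relay_closed); y = 1 closes the relay (forward)."""
    return index_to_voltage(x), bool(y)
```

For example, `drive_command(64, 1)` corresponds to a half-scale (2.5 V) forward command.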
2.2.3 Motion Execution: Steering Design
When the cart was first acquired, steering was performed manually by the operator; the open-differential-based propulsion system allowed the cart to rotate about itself. Therefore, a steering design had to be implemented
within the given packaging, loading, and environmental constraints in order
for the cart to operate autonomously.
All the designs took into consideration the available space on the cart.
The first considered design included a brake-based torque vectoring system, where the inside wheel would be braked via linear-actuator-powered calipers. This was eventually discarded due to complexity and cost. The final design instead used a linear actuator to turn the front caster wheels, much as in a conventional car.
The first consideration was finding the right linear actuator to provide
enough dynamic loading force to create a torque to turn the wheels. The
next step was to design the linkage system around the actuator using kinematic analysis. Figure 8 shows all the fixed and mobile link lengths. An important design consideration is that the kinematics of this design produce some “slip” at the outside wheel when turning: the outside wheel experiences extra friction relative to its intended radius of travel. In a normal car, the outside wheel is set at a slightly different angle than the inside wheel to prevent slip; in the team's case, this was a necessary compromise that did not noticeably affect how the cart handled, as it was constrained to low speed.
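The slip compromise above can be quantified with the Ackermann condition. The sketch below is a simple illustration with assumed wheelbase and track values (not the cart's measured dimensions); it shows that the ideal outer-wheel angle is smaller than the inner one, which is exactly the angle difference a parallel linkage gives up.

```python
import math

# Illustration of the steering "slip" compromise described above. With ideal
# Ackermann geometry the outer wheel turns less than the inner wheel; a
# parallel linkage sets both to the same angle, so the outer wheel scrubs.
# Wheelbase and track values are assumptions for illustration only.

L = 1.0  # wheelbase [m] (assumed)
W = 0.6  # front track width [m] (assumed)

def ackermann_angles(radius):
    """Ideal inner/outer front-wheel angles [rad] for a left turn of the
    given radius, measured to the center of the rear axle."""
    inner = math.atan(L / (radius - W / 2))
    outer = math.atan(L / (radius + W / 2))
    return inner, outer

inner, outer = ackermann_angles(3.0)
# inner > outer: the difference is the correction a parallel linkage ignores.
```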
Figure 8: Left to right kinematic diagram of steering linkage system made
with Solidworks
Once the kinematic analysis was completed, a simple Finite Element
Analysis (FEA) was done using Ansys to determine the failure points of
the system and optimize those areas. The material chosen was mild steel as
it was readily available and met stress requirements as shown by the analysis
while being relatively cheap. Basic mechanical analysis showed that corners
and the mounting hole were the places most affected by the load applied by the linear actuator. The final results of the analysis are shown
in Figure 28 in Appendix A.
The links were then waterjetted in the on-campus machine shop in Etcheverry Hall. After the links were installed on the cart, the next step was to determine where to mount the other end of the actuator so that it would simply pivot. This step took a significant amount of time and design consideration to ensure there would be no interference with other parts and that the system would operate as kinematically intended. A mild steel L-beam was used to fixture the actuator under the vehicle. The mounted actuator is shown in Figure 9 below.
Figure 9: Mounted Actuator Steering System
2.2.4 Sensor Rack Design & Fabrication
In order to affix the LiDAR and electrical equipment to the vehicle, a rigid
structure was installed on the cart. The sensor rack is placed 1.5 m above the ground, based on the desired viewing angle for the LiDAR. Using t-slot
railing allows for simple fixturing of brackets and other miscellaneous compo-
nents. The sensor rack is bolted onto the cart floor with L-brackets. Circuit
boards, microcontrollers, and power supplies are situated in this structure to
minimize space usage. The finalized sensor rack is shown in Figure 10 below.
2.3 Software Implementation
2.3.1 Low-level Control
Open-loop Control
A necessary step in autonomous development is collecting a map of the
environment of operation, defined in this case as the Lower Hearst parking garage. The LiDAR was used to create a 3D point cloud map, but
could not be obstructed, so an interface was developed where a person could
manually drive the cart without interfering with data collection. In reference
to the Arduino mentioned earlier, a Gamepad Joystick Shield was combined
with the Arduino Uno board and existing digital input interface so that we
could use the joystick and the buttons to control the movement of the cart.
The setup is shown below in Figure 11.
Figure 11: Controller setup for manual control
When the joystick is pushed forwards, the cart moves in the positive
direction. The speed increases as the joystick is pushed further forward, and
vice versa when reversing. A software speed limit was implemented for safe manual operation. Another software safety feature was a deadband: if the joystick was knocked suddenly or left untouched, the cart would not move, and only deliberate movements would drive it. An automatic slow-down function was also added so that if the joystick suddenly returned to the neutral position, the cart would decelerate gradually rather than stop abruptly.
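The safety behavior described above can be sketched as a simple command filter. All threshold values below are illustrative assumptions, not the values used on the cart.

```python
# Sketch of the joystick safety filters described above: a deadband so
# accidental nudges do not move the cart, a software speed limit, and a
# slew-rate limit that slows the cart gradually when the stick is released.
# All threshold values are illustrative assumptions.

SPEED_LIMIT = 0.5  # maximum commanded speed [m/s] (assumed)
DEADBAND = 0.1     # joystick magnitudes below this are ignored (assumed)
MAX_STEP = 0.05    # largest change per control cycle [m/s] (assumed)

def filter_command(joystick, prev_cmd):
    """Map a joystick reading in [-1, 1] to a safe speed command."""
    if abs(joystick) < DEADBAND:   # deadband: ignore accidental movement
        target = 0.0
    else:
        target = max(-SPEED_LIMIT, min(SPEED_LIMIT, joystick * SPEED_LIMIT))
    # Slew-rate limit: ramp toward the target instead of jumping to it.
    step = max(-MAX_STEP, min(MAX_STEP, target - prev_cmd))
    return prev_cmd + step
```

Called once per control cycle, the filter ramps the command toward the joystick target, so a sudden release decays the speed over several cycles instead of stopping the cart instantly.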
In the same Gamepad setup, the two yellow buttons were utilized for
turning the cart left and right with the linear actuator steering system. One
software-based safety design prevented both buttons from being pressed at once and placed a hard limit on how far the actuator would extend,
to avoid damage to the actuator and steering mechanism. At the end of the Fall 2019 semester, the team successfully used the joystick to control the
cart and drive in the Lower Hearst parking lot. Manual operation is shown
in Figure 12 below.
Figure 12: Manually driving the cart in the parking lot
Closed-loop Control
Once the team was able to operate the cart manually, the design of the
autonomous low-level control was implemented in a closed-loop design to
control both propulsion and steering. Instead of receiving inputs from the
Gamepad setup, the Arduino received the desired velocity and steering angle
of the cart from the Path Follower, which is discussed later in the report. The
Arduino would adjust the velocity and steering angle to eliminate error, so that the cart could follow the desired path as closely as possible.
For the propulsion, we implemented a feed-forward PI (Proportional In-
tegral) controller to achieve the closed-loop velocity control. With the strong
damping effect of friction and weight of the system, we assumed that deriva-
tive control was not necessary. The voltage input had a linear relationship
with the output wheel speed, which was averaged from two driving wheels.
A Signwise 360P optical encoder was installed on each driving wheel and interfaced with a 50 Hz interrupt routine on the Arduino. The “360P” denotes 360 pulses per revolution, which, combined with the 50 Hz sampling rate, gave precise velocity readings. To confirm the linear relationship, an experiment was conducted, as shown in Figure 13.
Figure 13: The experimental linear relationship between wheel speed and voltage input. (“index” here is the digital representation of the input voltage, for which index 0 equals 0 V and index 128 equals 5 V.)
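The wheel-speed measurement follows directly from the encoder specification: 360 pulses per revolution sampled at 50 Hz. A minimal sketch, with an assumed wheel radius:

```python
import math

# Sketch of the wheel-speed calculation from the 360 pulse-per-revolution
# encoders sampled by a 50 Hz interrupt. The wheel radius is an assumed
# value for illustration, not the cart's measured wheel size.

PULSES_PER_REV = 360
SAMPLE_HZ = 50
WHEEL_RADIUS = 0.1  # [m] (assumed)

def wheel_speed(pulses_in_interval):
    """Wheel surface speed [m/s] from the pulses counted in one 20 ms slot."""
    revs = pulses_in_interval / PULSES_PER_REV
    return revs * 2 * math.pi * WHEEL_RADIUS * SAMPLE_HZ

def cart_speed(left_pulses, right_pulses):
    """Average of the two driving wheels, as used for the PI feedback."""
    return 0.5 * (wheel_speed(left_pulses) + wheel_speed(right_pulses))
```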
When the Arduino receives the desired velocity command, it uses the linear relationship described above to derive an ideal voltage input. The feed-forward PI controller then eliminates the remaining error using readings from the optical encoders: referring to Equation 1 (in discrete time), the correction $u_{\mathrm{tuned}}(k)$, driven by the wheel-speed error, is added to the ideal input $\bar{u}(k)$ [7]. The block diagram in Figure 14 further elucidates this process.
$$u(k) = \bar{u}(k) + u_{\mathrm{tuned}}(k) = k_{ff}\, v_{\mathrm{ref}} + k_p\, e(k) + k_i \sum_{n=0}^{k} e(n) \qquad (1)$$
Figure 14: The block diagram of the feed-forward PI controller for the propulsion control
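A minimal discrete-time implementation of the control law in Equation 1 might look as follows; the gains here are placeholders, not the constants tuned on the cart.

```python
# A minimal discrete-time sketch of the feed-forward PI law in Equation 1.
# The gains below are placeholders, not the constants tuned on the cart.

class FeedForwardPI:
    def __init__(self, kff, kp, ki):
        self.kff, self.kp, self.ki = kff, kp, ki
        self.error_sum = 0.0  # running sum of e(0)..e(k)

    def update(self, v_ref, v_measured):
        """Return u(k) = kff*v_ref + kp*e(k) + ki*sum(e(n) for n in 0..k)."""
        e = v_ref - v_measured
        self.error_sum += e
        return self.kff * v_ref + self.kp * e + self.ki * self.error_sum

ctrl = FeedForwardPI(kff=1.0, kp=0.5, ki=0.1)
u0 = ctrl.update(v_ref=0.5, v_measured=0.0)  # 0.5 + 0.25 + 0.05
```

The feed-forward term supplies most of the input from the known linear voltage-speed relationship, leaving the PI terms to correct only the residual error.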
Manual tuning of the PI controller constants was then conducted with the cart on the ground of the second floor of Etcheverry Hall. With the optimal gains, the cart could accelerate from 0 to 0.8 m/s on a smooth surface in about 1.5 seconds and then run steadily at that speed. To demonstrate this result clearly, a simulated cart velocity based on measured data was created, as shown in Figure 15. However, testing in the parking garage was not conducted due to closure of the campus during the COVID-19 pandemic.
Figure 15: The simulated cart velocity with the PI controller
Closed-loop control of the steering relied on a straightforward relationship between actuator stroke length and steering angle, driven by a simple digital input. The actuator has a built-in potentiometer related to the stroke length, but during testing the analog potentiometer reading was found to be non-linear with respect to distance travelled, which made feedback difficult. Because this non-linear signal was not accurate enough for autonomous operation, the same type of optical encoder used for the propulsion system, together with a gearbox, was implemented for testing. This design allowed direct, linear feedback of the final steering angle.
To integrate with the high-level software, a middleware framework known as ROS (Robot Operating System) was used to connect all the software systems; this is laid out in the flowchart in Figure 3. Because real-time data was unavailable due to the campus closure, the team used a recorded ROS bag to send mock steering angles and velocity values to the Arduino. Simulation results showed promise, but more development time would have been required to successfully test this on the motors of the cart.
2.3.2 Mapping and Localization
Receiving Data from LiDAR
The mapping and localization algorithms relied on a 64-channel LiDAR from Ouster. This LiDAR was chosen because it provides high-resolution coverage of the environment around the cart and its stability is noteworthy. The LiDAR is connected to a 12 V power supply and to the computer via an Ethernet port, which allows the transfer of data from the LiDAR to the computer.
The current implementation of range-image projection is only suitable for sensors with evenly distributed channels, which limits the choice of hardware to higher-resolution units. For this projection, the team wrote a custom algorithm to interface with the Ouster LiDAR. To integrate the LiDAR with another package, the user only needs to call this launch file from the main launch file. The LiDAR requires some additional configuration steps; the readme file in the manufacturer's package can be used for this purpose.
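As a rough illustration of range-image projection for a sensor with evenly spaced channels, the sketch below maps a 3-D return to a (row, column) cell. The resolution and field-of-view values are assumptions for illustration, not the team's actual Ouster configuration or custom algorithm.

```python
import math

# Generic sketch of projecting a 3-D LiDAR return into a range image with
# evenly spaced channels. Resolution and field-of-view values below are
# assumptions for illustration, not the team's Ouster configuration.

ROWS, COLS = 64, 1024            # 64 channels, 1024 azimuth bins (assumed)
FOV_UP, FOV_DOWN = 16.6, -16.6   # vertical field of view [deg] (assumed)

def project(x, y, z):
    """Return (row, col, range) of one point in the range image."""
    r = math.sqrt(x * x + y * y + z * z)
    azimuth = math.atan2(y, x)                  # [-pi, pi], 0 = straight ahead
    elevation = math.degrees(math.asin(z / r))  # [deg]
    col = int(0.5 * (1 - azimuth / math.pi) * COLS) % COLS
    row = int((FOV_UP - elevation) / (FOV_UP - FOV_DOWN) * ROWS)
    row = min(max(row, 0), ROWS - 1)            # clamp out-of-FOV returns
    return row, col, r
```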
Mapping
The input to this software system is the LiDAR point cloud, and the output is the global map, which is sent to the global planner. The distance accuracy of mapping was improved from 92.1% to 97.9% using the “LeGO-LOAM” algorithm, which provides lightweight, ground-optimized LiDAR odometry and mapping for a horizontally placed LiDAR on a ground vehicle. The algorithm postulates that there is always a ground plane in the scan. The LeGO-LOAM GitHub repository contains the code for this odometry and mapping system for ROS-compatible UGVs (Unmanned Ground Vehicles). A diagram of the algorithm is attached in Figure 16.
Figure 16: Mapping Algorithm Pipeline
The algorithm consists of point cloud segmentation, feature extraction, laser odometry, and laser mapping. It is somewhat similar to RGBD-SLAM (RGBD refers to color and depth information from a camera, and SLAM to simultaneous localization and mapping). The system produces a 6-DOF pose estimate from the LiDAR input.

The system is divided into four modules. The first, segmentation, projects a single scan's point cloud onto a fixed-range image for segmentation, and the second sends the segmented point cloud to the feature extraction module. In the third module, the LiDAR odometry uses the extracted features to find the transformation of the robot pose between consecutive scans; this information is ultimately used in the construction of the LiDAR map. The fourth module combines the odometry measurements with the mapping result to output the final pose estimate.
The system originally takes in point clouds from a horizontally placed Velodyne VLP-16 LiDAR, with optional IMU (Inertial Measurement Unit) data as an additional input. In the team's case, the “Robosense” software allows the Ouster LiDAR to be used by giving access to the relevant parameters. The LiDAR provides the input sensor and odometry source values. Based on those two data sources, the algorithm calculates the sensor transformation and the pose estimate, and maps the environment, known as the “map server” on the computer. The result is attached below in Figure 17. Based on this data, the path planner can make decisions.
Figure 17: Mapping Data Collected From the Lower Hearst Parking Garage
Localization
The input of the localization node is the LiDAR point cloud, and the output is the localization of the cart, which is sent to the local path planner. The localization precision improved from 0.333 m to 0.201 m. An Extended Kalman Filter and an Unscented Kalman Filter were used to estimate the state of a moving object of interest from noisy LiDAR and radar measurements.
The architecture of the algorithm is demonstrated in Figure 18.
Figure 18: Localization Algorithm Pipeline
Here are the steps of the matching algorithm pipeline:
1. The 3D space around the robot is subdivided regularly into cells of constant width 0.1 m.
2. For each cell that contains at least three points, proceed to Step 3; otherwise move on to the next cell.
3. Collect all 3D points $X_i$, $i = 1, \ldots, n$, in the cell.
4. Calculate the mean $\mu = \frac{1}{n} \sum_i X_i$.
5. Calculate the covariance matrix $\Sigma = \frac{1}{n} \sum_i (X_i - \mu)(X_i - \mu)^T$.
6. The probability of measuring a sample at a 3D point $X$ contained in this cell is now modeled by the distribution $N(\mu, \Sigma)$.
7. Transform the next consecutive frame using an initial guess $T$.
8. Calculate the probability density of the 3D points transformed by $T$, using the model built in Step 6 and Equation 2.
$$p(x) = \frac{1}{(2\pi)^{D/2}\,|\Sigma|^{1/2}} \exp\left(-\frac{(x-\mu)^T \Sigma^{-1} (x-\mu)}{2}\right) \qquad (2)$$
9. Multiply the probability densities of all the 3D points in the second frame and optimize this value over $T$. The optimal transformation is the final result, shown in Figure 19.
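Steps 3-6 and the density of Equation 2 can be sketched directly. For brevity the example below works in 2-D (the pipeline uses 3-D cells), with the cell statistics and Gaussian density computed from first principles; all helper names are illustrative.

```python
import math

# Sketch of the per-cell normal-distribution model from Steps 3-6 and the
# Gaussian density of Equation 2. Shown in 2-D for brevity (the pipeline
# uses 3-D cells); all helper names are illustrative.

def cell_stats(points):
    """Mean and covariance of the points falling inside one cell."""
    n = len(points)
    d = len(points[0])
    mu = [sum(p[i] for p in points) / n for i in range(d)]
    cov = [[sum((p[i] - mu[i]) * (p[j] - mu[j]) for p in points) / n
            for j in range(d)] for i in range(d)]
    return mu, cov

def gaussian_pdf_2d(x, mu, cov):
    """Density of N(mu, cov) at x for the 2-D case (Equation 2 with D = 2)."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    dx = [x[0] - mu[0], x[1] - mu[1]]
    q = sum(dx[i] * inv[i][j] * dx[j] for i in range(2) for j in range(2))
    return math.exp(-q / 2) / (2 * math.pi * math.sqrt(det))
```

For a cell containing the points (0,0), (2,0), (0,2) and (2,2), the model is N([1,1], I), and the density is largest at the mean.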
The software performs real-time 3D localization using a standard LiDAR, such as the Velodyne HDL32e or VLP16. The package performs Unscented Kalman Filter-based pose estimation: it first estimates the sensor pose from data provided by the IMU built into the LiDAR, then performs multi-threaded NDT (Normal Distributions Transform) scan matching between a global map point cloud and the input point clouds to correct the estimated pose. IMU-based pose prediction is optional; if disabled, the system predicts the sensor pose with a constant-velocity model, without IMU information.
The high-level software builds on this SLAM accuracy and performs real-time 6-DOF SLAM using a 3D LiDAR. It is based on 3D graph SLAM with NDT scan-matching-based odometry estimation and loop detection. It also supports several graph constraints, such as GPS, IMU acceleration acting as a gravity vector, IMU orientation acting as a magnetic sensor, and a floor plane detected in the point cloud.
Figure 19: Localization Result
2.3.3 Path Planning
Both path planning and path following algorithms were developed to enable
movement of the autonomous electric cart. Path planning is applied once a
map of the environment is known, as discussed above, and is the process of
selecting a continuous path that will drive the cart from its initial to its goal
configuration [11].
Generally, the user will define an “optimality” condition such as shortest Euclidean distance or shortest time. In this case, the global path was
manually defined due to time constraints.
The point cloud collected during the mapping section was segmented to
remove the floor and ceiling points, leaving only the static obstacles as shown
in Figures 20, 21, and 22. Once the coordinates of these fixed obstacles were
plotted, a predefined continuous global path was selected that avoided all
static obstacles.
Sample paths are illustrated in Figures 20 and 21. An ellipse was chosen to
provide preliminary waypoints: it avoided all static obstacles while allowing
gentler turns better suited to the electric cart dynamics. Figure 21
depicts a parking maneuver that is constrained by the cart dynamics. The
parking maneuver illustrates an example of what could be expected of the
electric cart during its working lifetime while performing tasks such as loading
luggage onto airplanes.
Figure 20: Simple Ellipse Route Map (Red Points) shown on 2-D Point Cloud
Static Obstacle Map (Blue)
Figure 21: Simple Parking Maneuver Route Map (Red Points) shown on 2-D
Point Cloud Static Obstacle Map (Blue)
In order to test the accuracy of the localization and path planning al-
gorithms, a route map was initially defined as the precise route that was
taken during the mapping phase, as shown in Figure 22. This enabled the
algorithms to be tested as the path was simulated alongside the ROS bag of
the actual path that was taken, and the results were compared for validation
purposes.
Figure 22: Route Map from ROS Bag (Red Points) shown on 2-D
Point Cloud Static Obstacle Map (Blue)
2.3.4 Path Follower
As discussed in the prior section, the desired trajectory of the autonomous
cart was predefined. However, because of dynamic obstacles not represented
on the static map, and because sensor mismatch and noise keep the real cart
from moving exactly as simulated, a path follower was necessary to keep the
cart as close as possible to both the predefined waypoints and the reference
velocity [3].
Figure 23: Nonlinear Bicycle Model
Due to the low-speed scenario, a simplified nonlinear kinematic bicycle
model was used to represent the dynamics of the electric cart, as shown in
Figure 23 and Eqs. (3)-(6). Among the states, x and y are the global
coordinates of the center of mass, ψ is the global heading angle, and β is
the angle of the current velocity with respect to the longitudinal axis of
the cart. Among the inputs, v is the speed of the vehicle and δ_f is the
steering angle of the front wheel. Parameters l_f and l_r are the distances
from the front and rear wheels to the center of mass, respectively, and t_s
is the sampling time.
x(k+1) = x(k) + t_s v(k) cos(ψ(k) + β(k))   (3)
y(k+1) = y(k) + t_s v(k) sin(ψ(k) + β(k))   (4)
ψ(k+1) = ψ(k) + t_s (v(k)/l_r) sin(β(k))   (5)
β(k) = tan⁻¹((l_r/(l_r + l_f)) tan(δ_f))   (6)

where k is the time step.
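These update equations can be transcribed directly for simulation; the wheelbase and sampling-time values below are placeholders, not the cart's measured parameters:

```python
import numpy as np

def bicycle_step(x, y, psi, v, delta_f, lf, lr, ts):
    """One discrete step of the kinematic bicycle model, Eqs. (3)-(6):
    returns the next (x, y, psi) given speed v and front steering
    angle delta_f."""
    beta = np.arctan(lr / (lr + lf) * np.tan(delta_f))   # Eq. (6)
    x_next = x + ts * v * np.cos(psi + beta)             # Eq. (3)
    y_next = y + ts * v * np.sin(psi + beta)             # Eq. (4)
    psi_next = psi + ts * v / lr * np.sin(beta)          # Eq. (5)
    return x_next, y_next, psi_next

# Straight-line sanity check: zero steering leaves the heading unchanged
x1, y1, p1 = bicycle_step(0.0, 0.0, 0.0, 1.0, 0.0, lf=0.8, lr=0.8, ts=0.1)
```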
We are also faced with limits on the range and rate of input variation
due to actuator limitations:
v_min ≤ v(k) ≤ v_max   (7)
δ_min ≤ δ(k) ≤ δ_max   (8)
|v(k) − v(k−1)| ≤ Δ_v^max   (9)
|δ(k) − δ(k−1)| ≤ Δ_δ^max   (10)
The tracking reference is selected as a point a certain look-ahead distance
along the predefined path. The path-following controller calculates the
optimal control output (in this case the velocity v(k) and the steering angle
δ_f(k)) at each time step to track the reference, based on real-time state
measurements. This control output is passed to the low-level controller to
drive the physical system [3].
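A minimal sketch of the look-ahead reference selection might look like the following; the waypoint layout and look-ahead distance are illustrative, not the values used on the cart:

```python
import numpy as np

def lookahead_reference(path, position, lookahead):
    """Pick the tracking reference: the first waypoint at least
    `lookahead` metres along the path from the waypoint closest to
    the cart. `path` is an (N, 2) array of waypoints; this is a
    simplified sketch of the selection described in the text."""
    dists = np.linalg.norm(path - position, axis=1)
    nearest = int(np.argmin(dists))
    # arc length along the path, measured from the nearest waypoint
    seg = np.linalg.norm(np.diff(path[nearest:], axis=0), axis=1)
    arc = np.concatenate(([0.0], np.cumsum(seg)))
    ahead = int(np.searchsorted(arc, lookahead))
    return path[min(nearest + ahead, len(path) - 1)]

path = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])
ref = lookahead_reference(path, np.array([0.1, 0.0]), lookahead=1.5)
```

Choosing the reference ahead of the cart, rather than at the nearest waypoint, keeps the controller from chasing a point it has already passed and smooths the steering response.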
We applied Model Predictive Control (MPC) [19]. At each time step, a
constrained finite-time optimal control problem (CFTOC) with horizon length
N = 10 is solved subject to the constraints of Eqs. (3)-(10), with a cost
function J that penalizes both the distance between the current state
z(k) = [x(k), y(k), ψ(k)] and the reference z_ref(k) = [x_ref(k), y_ref(k), ψ_ref(k)],
and the amount of actuation:
J = Σ_{k=0}^{N−1} ( ||z(k+1) − z_ref(k+1)||²_Q + ||u(k)||²_R )   (11)

with weight matrices Q and R.
Compactly, the full controller formulation is

  min_{z(1),…,z(N), u(0),…,u(N−1)}  Σ_{k=0}^{N−1} ||z(k+1) − z_ref(k+1)||²_Q + ||u(k)||²_R   (12)

  s.t.  z(k+1) = f(z(k), u(k)),   k = 0, …, N−1   (13)
        u_min ≤ u(k) ≤ u_max,   k = 0, …, N−1   (14)
        |u(k) − u(k−1)| ≤ Δ_u^max,   k = 1, …, N−1   (15)
        z(0) = z_t   (16)
where the dynamics constraint Eq. (13) corresponds to Eqs. (3)-(6) and the
input constraints Eqs. (14)-(15) are elucidated in Eqs. (7)-(10). The initial
constraint Eq. (16) ensures that every CFTOC runs in closed loop with the
latest system state at time t.
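As an illustrative sketch (not the project's actual implementation), the CFTOC can be solved numerically by rolling the dynamics out over the horizon and optimizing the input sequence directly. The horizon length, weights, wheelbase, and bounds below are placeholders, and the input-rate constraints of Eq. (15) are omitted for brevity:

```python
import numpy as np
from scipy.optimize import minimize

LF, LR, TS, N = 0.8, 0.8, 0.1, 5   # placeholder parameters

def step(z, u):
    """Bicycle dynamics f(z, u) of Eqs. (3)-(6); z = [x, y, psi],
    u = [v, delta_f]."""
    v, delta = u
    beta = np.arctan(LR / (LR + LF) * np.tan(delta))
    return np.array([z[0] + TS * v * np.cos(z[2] + beta),
                     z[1] + TS * v * np.sin(z[2] + beta),
                     z[2] + TS * v / LR * np.sin(beta)])

def cost(u_flat, z0, z_ref):
    """Cost of Eq. (11) with Q = I and R = 0.01 I, rolled out open
    loop so the dynamics constraint Eq. (13) holds by construction."""
    u = u_flat.reshape(N, 2)
    z, total = z0, 0.0
    for k in range(N):
        z = step(z, u[k])
        total += np.sum((z - z_ref) ** 2) + 0.01 * np.sum(u[k] ** 2)
    return total

z0 = np.array([0.0, 0.0, 0.0])
z_ref = np.array([1.0, 0.0, 0.0])            # look-ahead reference
bounds = [(0.0, 2.0), (-0.5, 0.5)] * N       # Eqs. (7)-(8) / (14)
res = minimize(cost, np.zeros(2 * N), args=(z0, z_ref), bounds=bounds)
v0, delta0 = res.x[0], res.x[1]              # only the first input is applied
```

In true receding-horizon fashion, only (v0, δ_f0) would be sent to the cart; the problem is then re-solved at the next time step from the newly measured state, which is what Eq. (16) encodes.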
The path follower was tested in simulation prior to deployment on the
physical system. In this simulation, each waypoint's coordinates served as
the reference values, while the simulated x and y results served as the
actual values. The actual x and y started from the first waypoint's
coordinates, and the actual velocity started from zero. As Figure 24 shows,
the actual speed fluctuated around the reference value, and the actual x and
y coordinates stayed very close to the reference values, indicating that the
path-follower algorithm is effective.
Figure 24: Comparison of the actual vehicle state versus the reference value
For better visualization, a simple vehicle with wheels was plotted and its
progress along the waypoints was observed, as seen in Figure 25.
In this case, the cart (pictured in cyan) is enlarged for visualization
purposes. The red wheels indicate the steering-angle control output, and the
open-loop prediction is illustrated in yellow.
Once the path following appeared accurate, the control output was to be
published via a ROS topic to the electric cart rather than to the simulation.
Unfortunately, due to the campus closure, this objective could not be achieved.
Figure 25: 2-D Simulation of cart model following predefined waypoints
(black) in point cloud map (grey)
2.3.5 Vehicle Operation and Testing
Testing was split into four phases, discussed in detail below.
Phase 1
At the beginning of the semester, the team set out to retrofit the cart with
a custom propulsion system under joystick control. This initial step made it
possible to test the cart without the standard handles and provided a gateway
to autonomously controlling the cart's velocity through digital inputs from
the Arduino. The next step was to test the implemented steering system once
all the hardware was installed. Because the joystick controller had buttons,
the actuator could extend or retract depending on which button was pressed.
Testing of both systems was completed within the first several months in the
hallway outside the MPC Lab, with a person sitting on the cart and
maneuvering it with the joystick.
Phase 2
After initially testing the cart indoors, the team decided to test in the
Lower Hearst Garage, the closest available environment to an airport, with
"vehicles" and people around. The Ouster LiDAR was mounted for this occasion
to collect a map of the floor. A couple of loops were driven on a single
floor, which provided a solid test of the structural rigidity of the sensor
rack and the steering system, as well as of the LiDAR's resolution in this
specific environment. Because of the tolerances designed into the steering
system and the rigidity of the manufactured sensor rack, minor imperfections
did not affect the systems during testing.
Phase 3
Phase 3 involved testing all of the software in simulation. As discussed
earlier in the report, a visualization was created that showed the cart
moving in real time, with the open-loop prediction and the wheel direction
illustrating the control input. The waypoints used for testing lay along the
path the cart took during mapping in Phase 2. This enabled the software stack
to simulate the cart moving "live" alongside the ROS bag collected during
Phase 2, and the software output was compared with the real results. The
path following and localization functioned as expected together, and it was
decided to proceed to system-integration testing.
Phase 4
The intention of the final phase of testing was to integrate the software
stack and the hardware to test the functionality of the complete autonomous
system. Part of this included fine-tuning the low-level control for both the
throttle and the steering. The team was confident in its software validation
and felt it was functional enough to deploy the electric cart autonomously in
the Lower Hearst parking lot under supervised conditions.
However, the merging of the software stack and the physical system was not
achieved due to the campus closure in the final months. It is expected that
incorporating real, noisy measurements will reveal unforeseen difficulties
and that many parameters within the algorithms will require significant
tuning. This final testing phase has been left for future work.
3 Conclusion
Within nine months, this team of five, along with two advisors, completed a
technological study addressing the autonomy problem in indoor environments,
such as the airport scenario described in Section 1. The initial phase of
the project was the acquisition of a suitable platform that could be
retrofitted for autonomous development. The mechanical team developed custom
steering and propulsion systems that allow manual operation of the cart and
lay a foundation for autonomous control in place of a human operator.
The software team developed algorithms that collected a point cloud map of
the Lower Hearst parking lot, used the map to generate a predefined route,
and used localization and path-following algorithms to output control values
to the cart's motors, enabling it to function entirely autonomously.
This system contains, albeit in simplified form, all of the aspects of the
software stack currently used by commercial autonomous-vehicle manufacturers
such as Zoox, Cruise, and Waymo. One aspect that could be added is robust
object detection: an attempt was made to implement a rudimentary system, but
it proved beyond the scope of this project. In today's environment, safety is
one of the most critical functions of an autonomous vehicle, and robust
object detection would be necessary before real-world deployment.
This platform can be used for much more than hauling, as the flat cargo area
allows a plethora of installations for its final use case. While the team is
unable to continue development, they are proud of delivering an end-to-end
product within the short project time frame of nine months.
4 Future Work
As with many projects, compromises had to be accepted in order to deliver a
minimum viable product as proof of concept. In the case of this capstone, as
with many others in the MEng program, progress was cut short by COVID-19.
The lack of access to the platform prevented further testing of the
autonomous software. Had the team been able to continue working on the
hardware in person, further development would have included testing the
optical encoder on the steering system.
An entire electrical system allowing seamless switching between manual and
autonomous modes was under development, but required substantial
labor-intensive work to complete. Implementing kill switches would be
important for functional safety when the platform is in operation. On the
hardware side, systematic tuning is also necessary for both the longitudinal
and steering feedback, so that the cart is guaranteed to move stably and
accurately in the parking lot where it is tested.
On the software side, a more elegant method of global path planning could be
implemented, potentially using the A* algorithm. The manually selected
waypoints worked perfectly for proof-of-concept purposes, but a robust
planning algorithm would increase the transferability of this autonomous
system.
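As a sketch of what such a planner could look like on a 2-D occupancy grid derived from the static obstacle map (the grid, start, and goal below are toy values, not project data):

```python
import heapq

def a_star(grid, start, goal):
    """Minimal A* on a 4-connected occupancy grid (1 = obstacle) with a
    Manhattan-distance heuristic. Returns the path as a list of
    (row, col) cells, or None if the goal is unreachable. This is a
    sketch of the suggested future planner, not the method actually
    used (waypoints were selected manually)."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    frontier = [(h(start), start)]          # (f = g + h, cell)
    came, g = {start: None}, {start: 0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:                     # reconstruct the path
            path = []
            while cur is not None:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0
                    and g[cur] + 1 < g.get(nxt, float("inf"))):
                g[nxt] = g[cur] + 1
                came[nxt] = cur
                heapq.heappush(frontier, (g[nxt] + h(nxt), nxt))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = a_star(grid, (0, 0), (2, 0))   # routes around the blocked row
```

The resulting cell sequence would still need smoothing into gentle curves, as with the ellipse in Figure 20, before it suited the cart dynamics.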
Future work would also include testing the software stack on the physical
cart. The final software results can be shown in simulation, but they have
not been tested on the cart itself. Due to sensor noise and the use of an
idealized bicycle model, it is expected that parameters within the
high-level control will require significant testing and tuning prior to
deployment.
As mentioned in the conclusion, a more extensive object-detection algorithm
would be extremely beneficial to increase the market potential of the product
and to enable its transition to full autonomy. The initial cart design
included a Velodyne LiDAR on each of the four corners to detect nearby
objects at low elevations, such as small children. Given that safety is a
critical feature for which many autonomous-vehicle companies, such as Tesla,
have come under scrutiny, this feature must be fully developed and tested
prior to market release.
A GitHub link in Appendix A houses all the project code and further project
information. An image of the MEng student team that developed this product is
shown below in Figure 26.
Figure 26: MEng Student Team with Finished Hardware Platform
5 Appendix A
Link to code repository: GitHub
Figure 27: Curtis 1212P Circuit Diagram
Figure 28: Equivalent Stress with minimal stress points along chamfered
corners of component
Bibliography
[1] Iyad Alomar and Jurijs Tolujevs. Optimization of ground vehicles move-
ment on the aerodrome. Transportation Research Procedia, 24:58–64,
2017.
[2] Richard Bishop. Outrider-your path to autonomous yards.
https://www.outrider.ai/system/, accessed 23.04.2020.
[3] Ashwin Carvalho, Stéphanie Lefèvre, Georg Schildbach, Jason Kong,
and Francesco Borrelli. Automated driving: The role of forecasts and
uncertainty - A control perspective. European Journal of Control, 24:14–
32, 2015.
[4] Vestil Manufacturing Corp. Electric material handling cart.
https://www.vestil.com/product.php?FID=1022, accessed 23.04.2020.
[5] Wen Bo Du, Zhi Xi Wu, and Kai Quan Cai. Effective usage of shortest
paths promotes transportation efficiency on scale-free networks. Physica
A: Statistical Mechanics and its Applications, 392(17):3505–3512, 2013.
[6] Sebastian Frank, Per Martin Schachtebeck, and Peter Hecker. Sensor
concept for highly-automated airport tugs for reduced emission taxi
operations. 30th Congress of the International Council of the Aeronautical
Sciences, ICAS 2016, pages 1–9, 2016.
[7] Desborough Honeywell. Chapter 10: PID control.
[8] Steve Jurvetson. Motrec market-specific solutions.
https://www.motrec.com/markets/, accessed 23.04.2020.
[9] Shinpei Kato, Eijiro Takeuchi, Yoshio Ishiguro, Yoshiki Ninomiya,
Kazuya Takeda, and Tsuyoshi Hamada. An open approach to au-
tonomous vehicles. IEEE Micro, 35(6):60–68, 2015.
[10] Jelena Kocic, Nenad Jovicic, and Vujo Drndarevic. Sensors and Sensor
Fusion in Autonomous Vehicles. 2018 26th Telecommunications Forum,
TELFOR 2018 - Proceedings, pages 1–4, 2018.
[11] Jason Kong, Mark Pfeiffer, Georg Schildbach, and Francesco Borrelli.
Kinematic and dynamic vehicle models for autonomous driving con-
trol design. IEEE Intelligent Vehicles Symposium, Proceedings,
2015-August:1094–1099, 2015.
[12] Brian Paden, Michal Čáp, Sze Zheng Yong, Dmitry Yershov, and Emilio
Frazzoli. A survey of motion planning and control techniques for self-
driving urban vehicles. IEEE Transactions on Intelligent Vehicles,
1(1):33–55, 2016.
[13] Fabio Real, Anas Batou, Thiago Ritto, and Christophe Desceliers.
Stochastic modeling for hysteretic bit–rock interaction of a drill
string under torsional vibrations. Journal of Vibration and Control,
(March):107754631982824, 2019.
[14] Rafael Rey, Marco Corzetto, Jose Antonio Cobano, Luis Merino, and
Fernando Caballero. Human-robot co-working system for warehouse
automation. IEEE International Conference on Emerging Technologies
and Factory Automation, ETFA, 2019-Septe:578–585, 2019.
[15] Wilko Schwarting, Javier Alonso-Mora, and Daniela Rus. Planning and
Decision-Making for Autonomous Vehicles. Annual Review of Control,
Robotics, and Autonomous Systems, 1(1):187–210, 2018.
[16] Seunghyuk Choi, Florian Thalmayr, Dominik Wee, and Florian Weig. Ad-
vanced driver-assistance systems: Challenges and opportunities ahead.
https://www.outrider.ai/system/, accessed 23.04.2020.
[17] Tixiao Shan and Brendan Englot. LeGO-LOAM: Lightweight and
Ground-Optimized Lidar Odometry and Mapping on Variable Terrain.
IEEE International Conference on Intelligent Robots and Systems, pages
4758–4765, 2018.
[18] Spencer Soper. Amazon is looking for the perfect warehouse worker.
https://www.bloomberg.com/news/articles/2015-05-28/robot-with-a-
human-grasp-is-amazon-s-challenge-to-students, accessed 23.04.2020.
[19] Shuyou Yu, Yang Guo, Lingyu Meng, Ting Qu, and Hong Chen.
MPC for Path Following Problems of Wheeled Mobile Robots. IFAC-
PapersOnLine, 51(20):247–252, 2018.