1. Acronym Definitions
2. Introduction
2.1 Background and Motivation
2.2 Overview of proposed project
2.3 Deliverables
3. Description of Project – Tier 1 Level
3.1 Top-level diagrams and system flowcharts
Tier 1 Hardware System Overview
Li-ion Battery
Battery Management System (BMS)
Voltage Regulator
Microcontroller
Motor
MicroVision Lidar PSE-0400li3-101 Sensor
Tier 1 Software System Overview
Microcomputer
Processing Server
PCD Database on S3
Application Server
Web Viewer / Web App
3.2 Identification of problems and issues
4. Description of Project - Tier 2 Hardware
4.1 Identification and specifications of major components
3D Printed Battery Case
Battery Management System (BMS)
Voltage Regulator
Power PCB
Raspberry Pi 4 Model B
Stepper Motor
Stepper Driver
Rotary Encoder
MicroVision LiDAR PSE-0400li3-101 Sensor
Rotating Platform
4.2 Layout diagrams for specific components and subsystems
Battery Management System (BMS)
Stepper Driver
Lidar Scanner and Rotary Encoder
4.3 Explanation of how software interfaces with hardware
4.4 How hardware was fabricated and assembled
5. Description of Project - Tier 2 Software
5.1 Identification of software that will be used
5.2 System requirements to run software
Raspberry Pi 4 B
IDE
5.3 Software Systems Overview
5.4 Details of Software Systems
Application Server
Processing Server
Overview
Processing Script
API Endpoints
PCD Database
Overview
Rotary Encoder
LiDAR Sensor
Motor Driver
ROS Bag
Processing Server
Web App
6. Integration and Testing
6.1 Procedure for testing and debugging
Hardware Testing and debugging
Testing: BMS / Li-Ion charger
Testing: LiDAR sensor, voltage regulation and current
Testing: Switching frequency of the voltage regulator
Testing: Stepper motor test & stepper driver Test
Software Testing and debugging
Tools
Overview
Automated tests
Routes
Manual Tests
6.2 Final results
7. Bill of Materials (BOM)
9. List of Industry Standards
IEEE 802 standards
IEEE 1394
Inter-integrated Circuit (I2C) Protocol
Laser Safety Standard
Python 2
10. Acknowledgements
11. Bibliography
12. Appendices
1. Google Cartographer - high level system overview
1. Acronym Definitions
● API - Application Programming Interface
● AWS - Amazon Web Services
● BMS - Battery Management System
● CAD - Computer Aided Design
● EC2 - Elastic Compute Cloud
● IDE - Integrated development environment
● Li-ion - Lithium-ion
● LiDAR - Light Detection and Ranging
● MVIS - MicroVision
● OpenCV - Open Computer Vision
● PCD - Point Cloud Data
● ROS - Robot Operating System
● S3 - Simple Storage Service
● SLAM - Simultaneous Localization and Mapping
● TOF - Time of Flight
2. Introduction
2.1 Background and Motivation
LiDAR (Light Detection and Ranging) is a sensing technology that uses light to measure range. The
system pulses a laser at a scene and measures the distance to an object using the
round-trip travel time and the speed of light.
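The round-trip relationship can be illustrated in a few lines of code. This is a generic sketch of the time-of-flight calculation, not MicroVision's implementation:

```python
def tof_distance_m(round_trip_s: float, c: float = 299_792_458.0) -> float:
    """Distance to a target from a laser pulse's round-trip travel time.

    The pulse covers the path twice (out and back), so the one-way
    distance is half of c * t.
    """
    return c * round_trip_s / 2.0

# A pulse returning after 20 ns corresponds to a target about 3 m away.
distance = tof_distance_m(20e-9)
```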
Although LiDAR was first used after the invention of the laser in 1960, the underlying concept
has existed since 1938 when cloud heights were measured using pulses of light (Gregersen).
Edward Hutchinson Synge, known as Hutchie, first proposed the concept of LiDAR: an
observer aims searchlights at the night sky while a reflector bounces the returned light into a
photodetector (Neff). By using a fixed angle, the altitude could be determined (Neff). LiDAR
was first implemented on airplanes and used to map the ground (Gregersen). This is done by
rapidly firing LiDAR pulses at the ground and collecting the pulse that returns upon reflection.
GPS and an IMU (inertial measurement unit) are also used to determine the position and
orientation of the airplane (Gregersen). Our project utilizes stitching to create a 360-degree depth
view of an interior space.
The indoor room mapping system will be used by MicroVision to demonstrate to potential
customers their new LiDAR’s capability. In addition, indoor room mapping can be used to
generate maps of indoor environments to help people navigate, or for first responders to
familiarize themselves with an environment they are en route to, among other uses.
2.2 Overview of proposed project
Our Capstone project involved the design and implementation of a system where a user can
capture 360° scans of an indoor environment in 3D using the MicroVision LiDAR sensor. The
user can view the result in our custom-built website or their own point cloud viewer. Our
industrial sponsor, MicroVision, has recently begun to market their new indoor LiDAR sensors
and they wanted a system that they can use to show potential customers the capabilities of their
new LiDAR.
Our project included writing software to process, filter, stitch, and view the data. Our prototype
processed the data from the sensor and displayed it on the website. A rotating platform was
implemented to autonomously rotate the sensor 360° to capture data hands-free. A system to
portably power the system was also designed. Our prototype is operated wirelessly through the
cloud.
2.3 Deliverables
● Implemented a stitching algorithm to capture a static, time-averaged 360-degree depth
view of an interior space, such as a room, using MicroVision’s Consumer LiDAR sensor.
● Processed the obtained data such that it can be viewed and manipulated on a laptop PC.
● Automated the capture of a 360-degree scan of the room through the use of a motion-controlled platform.
● Made the platform, sensor, battery, and any other computational hardware operate
untethered.
3. Description of Project – Tier 1 Level
3.1 Top-level diagrams and system flowcharts
Tier 1 Hardware System Overview
Figure 3.1.1 Hardware System Overview Diagram
The diagram in Figure 3.1.1 shows our Tier 1 block diagram for the design of our hardware
system. The entire system is powered by lithium-ion batteries (INR18650-35E), which feed into
the battery management system (VUPN712). The BMS is
connected to a voltage regulator (D24V50F5), which supplies the required power to each device.
The regulator powers the microcomputer and the LiDAR sensor. The stepper driver powers the
stepper motor, and the driver is powered directly from the BMS board output. The
microcomputer connects to the stepper driver to command the motor to start or stop.
Li-ion Battery
The power source consists of three series-connected 18650 Li-ion rechargeable battery
cells. Each cell provides approximately 3.6 V with a capacity of about 3.5 Ah, depending
on the discharge rate.
Battery Management System (BMS)
The BMS is a circuit board that charges and discharges the Li-ion batteries safely. It
provides over-current and over-voltage protection in case the load exceeds what the
battery can safely handle. In addition, the BMS balances the cells and keeps the battery
operating within its required specifications. The BMS is not used to charge the batteries;
a separate external charger is used for charging.
Voltage Regulator
The output voltage from the BMS is sent to two 5 V fixed-output buck converters (voltage
regulators), each capable of supplying 5 A. One converter powers the Raspberry Pi 4, the
other the LiDAR sensor. The outputs from the converters go to a custom circuit that filters
switching-converter noise and provides overvoltage protection and fusing.
Microcontroller
A Raspberry Pi 4 (PI4-2GB-K102-1) microcomputer is used as the central hub to
control the system. It is responsible for starting and ending the scanning process. It
controls the steps and direction of the stepper motor, provides the main interface for the
LiDAR scanner, and communicates with our cloud server to transmit all captured data
and receive commands from the website.
Motor
To control the direction the scanner is facing, a motor rotates the scanning platform to
the desired angle. The microcomputer instructs the motor and monitors its status to
place the platform at the position requested for the sensor to scan.
MicroVision Lidar PSE-0400li3-101 Sensor
By emitting a short laser pulse and measuring the round-trip time, the distance to an
object is determined. By firing many pulses at a given object, a 3D representation of
the object can be constructed.
Figure 3.1.2 MicroVision LiDAR
(“Consumer LiDAR Product...”)
Tier 1 Software System Overview
Figure 3.1.3 Software Overview Flow Chart
The Tier 1 software flowchart of our system is shown in Figure 3.1.3. Our system
requires software to be implemented on the Raspberry Pi 4 Model B, a processing
server, an application server, a point cloud database, and a website. When the system
has been powered up and the user presses the scan button, the system initializes and
begins scanning. Received data is stored locally on the Raspberry Pi 4. Once scanning
is complete, the data is transferred to the processing server. The processing server
uses our custom stitching algorithm to stitch the data into a cohesive model and then
stores the model in a database. At any point after processing, a client computer can
view the point cloud (a collection of XYZ points in 3D space) in a web browser.
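The hand-offs in this flow can be sketched as a sequence of stages. The function and stage names below are hypothetical stand-ins, chosen only to mirror the components in Figure 3.1.3, not our actual code:

```python
# Hypothetical stage names; each maps to a component in Figure 3.1.3.
def run_pipeline(frames):
    local = list(frames)          # Raspberry Pi: store frames locally
    uploaded = upload(local)      # Pi -> processing server transfer
    model = stitch(uploaded)      # processing server: stitching algorithm
    url = store(model)            # PCD database (S3): persist the model
    return url                    # web viewer fetches the model from here

def upload(frames):
    return frames

def stitch(frames):
    return {"points": sum(frames, [])}

def store(model):
    return f"s3://scans/model-{len(model['points'])}.pcd"

print(run_pipeline([[1, 2], [3]]))  # prints s3://scans/model-3.pcd
```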
Microcomputer
The microcomputer, such as a Raspberry Pi 4, runs software to enable the capturing
of data from the LiDAR sensor. The microcomputer temporarily stores the data
onboard until the scan is complete and then transmits it to the server. It is powered by
the onboard power system.
Processing Server
The processing server is an AWS EC2 instance. EC2 is a service provided through
Amazon that allows for a virtual server to enable cloud computing. Processing point
clouds is computationally intensive; EC2 allows us to host our code on a Linux
machine in the cloud with the computational power to do our processing. It has our
stitching algorithm running that combines all the scan frames into one model. Once
finished it posts the model to the PCD Database.
PCD Database on S3
The database is an AWS S3 bucket that stores the generated models in the PCD
file format. Amazon's S3 service allows for object storage in the cloud and
provides programmatic file storage.
Application Server
This server relays information and commands from the Microcomputer to the web viewer and
vice versa through a TCP connection on the server. The Application server also transfers data to
and from the Web App, such as scan data and static files required to run the website.
Web Viewer / Web App
A 3D web-based display engine, Three.js, has been implemented to display the
processed PCD file, which is retrieved from the Application server. Three.js is a
cross-browser library that enables the viewing of 3D computer graphics in a web
browser. The processed point cloud is shown in a way that enables the viewer to
move, rotate, and zoom within the generated point cloud. The view can also be
oriented with a top-down perspective or a first-person perspective, giving multiple
perspectives of the room. In addition, a depth shader has been introduced to more
easily gauge depth in the point cloud.
The Web Viewer is also used to send commands to the microcomputer, such as
changing the resolution, changing the number of rotations, giving the scan a name,
and initiating the scan. The Web App displays the status of the scan.
3.2 Identification of problems and issues
Several issues came up during the development and implementation of this project,
including but not limited to:
• Depth Data Stitching
Initially our plan was to use a SLAM (simultaneous localization and mapping) algorithm
to stitch our frames together. We were using a SLAM implementation that Google
created called Cartographer. However, this approach proved too difficult for the amount
of time we were given. We could not get the system configured the way Cartographer
expected. Instead, we implemented our own stitching algorithm that combines frames
based on the rotation of the device. A diagram of the inner workings of Cartographer is
included in the appendix.
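The rotation-based approach can be sketched as follows: each frame's points are rotated about the vertical axis by the platform angle at which the frame was captured, then concatenated. This is a simplified illustration of the idea, not our exact implementation:

```python
import math

def rotate_yaw(point, angle_deg):
    """Rotate an (x, y, z) point about the vertical (z) axis."""
    a = math.radians(angle_deg)
    x, y, z = point
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

def stitch(frames):
    """frames: list of (platform_angle_deg, [points]) pairs.

    Returns a single point cloud with every frame transformed into
    the common (world) coordinate frame of the platform.
    """
    cloud = []
    for angle, points in frames:
        cloud.extend(rotate_yaw(p, angle) for p in points)
    return cloud

# A point 1 m ahead, captured after a 90° platform turn, lands on the y-axis.
cloud = stitch([(0, [(1.0, 0.0, 0.0)]), (90, [(1.0, 0.0, 0.0)])])
```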
• Motor Control
Initially, we attempted to implement ramping (acceleration / deceleration) of the motor to
more smoothly control our platform rotations. After much effort made to implement this
feature, we decided to control our platform without ramping. Time spent on this feature
was resulting in a block on progress, so we decided as a team to move forward with a
constant speed. We believe the issue was that the Raspberry Pi could not send signals fast
enough through the GPIO pins.
Lastly, we experienced an issue in capturing depth data due to jitter introduced into the
system by the motor. We were able to fix this by introducing a 0.5 second delay after
each motor command.
• BMS (Battery Management System)
The BMS did not perform consistently: sometimes it would turn on and sometimes it
would not. This occurred whenever the batteries were replaced. BMS boards are
typically designed to be permanently connected to the batteries, and this may have been
the root cause of the issue. After rigorous testing and research, we fixed this by
connecting a push button between the B+ and P+ terminals of the BMS, as shown in
Figure 4.2.2. When batteries are first inserted, and before activating the power toggle
switch, the push button is pressed to reset and turn on the BMS.
• Battery Case
The first battery case that we tried had contact issues that prevented the Li-ion batteries
from making a proper connection to the metal battery terminals on the case. To
overcome this issue, we designed our own 3D printed battery case with better contact
points designed for lithium-ion batteries. In addition, the battery case includes wire
harnesses, a BMS holder, and a push button holder to make the routing neater.
4. Description of Project - Tier 2 Hardware
4.1 Identification and specifications of major components
4.1.1 Lidar Scanner components
The major components needed for a working LiDAR system are listed below. The
specifications of each component were researched and checked against our desired
system build.
Lithium-Ion Battery (Li-ion)
Figure 4.1.1 Samsung 18650 Li-ion Battery
Figure 4.1.1 shows the specification and part image of the battery cell used. The Samsung
18650 is a lithium-ion cell 18.5 mm in diameter and 65.25 mm long. The Li-ion battery
chemistry is able to hold a large amount of electrical charge. This particular model
(INR18650-35E) is rated at 3500 mAh and is capable of delivering 3.5 A for 1 hour at a
nominal voltage of 3.6 V; the cell voltage can reach 4.2 V at full charge. For this project,
three cells were connected in series to create a total output voltage of 10.8-12.6 V,
depending on battery charge, at 3.5 A. This amount of energy is enough to power all
system components for approximately one hour.
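The pack figures quoted above can be verified with simple arithmetic, using the cell datasheet numbers from this section:

```python
cells_in_series = 3
nominal_v = 3.6       # V per cell (nominal)
capacity_ah = 3.5     # Ah per cell; a series pack keeps the cell capacity

pack_nominal_v = cells_in_series * nominal_v    # 10.8 V nominal
pack_energy_wh = pack_nominal_v * capacity_ah   # roughly 37.8 Wh

# At a ~3.5 A system load, runtime is about capacity / current = 1 hour.
runtime_h = capacity_ah / 3.5
```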
3D Printed Battery Case
Figure 4.1.2 Battery Holder
The battery holder shown above was designed and 3D printed to hold three 18650 batteries
connected in series. It was designed to allow easy replacement of the batteries and to
provide good electrical contact with the battery terminals. The cells are connected in
series, and the BMS senses the voltage of each cell. There is a cavity to locate the BMS
board, as well as a wire harness to route the power cables cleanly, which allowed us to
easily connect each cell's terminals to the BMS board. There is also a holder for a push
button that acts as a reset switch for the BMS. Each time the batteries are replaced,
the push button must be pressed to reset the BMS.
Battery Management System (BMS)
Figure 4.1.3 BMS
Figure 4.1.3 depicts the BMS used in our system. It is a 3S BMS, meaning it is made for
three batteries connected in series, which matches our specification of three Li-ion cells.
This BMS has an 8 A limit on output current, which meets our specification with some
additional headroom, since our system only needs around 5 A. The BMS also has various
circuit protections, such as over-current, short-circuit, and over-discharge protection.
The BMS quiescent current during operation is approximately 50 µA.
Voltage Regulator
Figure 4.1.4 5V, 5A Step-Down Voltage Regulator
The step-down voltage regulator shown in Figure 4.1.4 lowers the battery voltage so that
each device in the system receives its required supply voltage, since our batteries output
around 11.1 V and the rest of the system requires less. It is a DC-DC switching buck
converter that operates at 90% efficiency and accepts an input voltage of up to 38 V.
Two voltage regulators were used: one for the LiDAR sensor and one for our
microcomputer (Raspberry Pi), with each regulator's 5 V output connected to its device.
Note that the regulator can output a continuous current of at most 5 A. This is not a
problem, however, because the current load on each regulator does not surpass 5 A. A
low-pass filter was placed on the output of each voltage regulator to reduce noise that
may be induced by the regulators.
Power PCB
Figure 4.1.5 Power PCB
The PCB shown in Figure 4.1.5 was designed to house two voltage regulators, two Pi filters,
two fuses, and two Zener diodes. The PCB is 1.9 inches by 1 inch. Its width can be
reduced to 0.8 inches if desired, because there is an extra 0.2 inches at the top of the
board that can be sheared off. There are keep-out and restricted areas around each screw
hole to prevent the copper ground plane from touching the screws. A single-layer ground
plane is on the bottom of the board. The trace width is 24 mils, calculated for a current of
3.3 A through each trace and a copper weight of 2 oz/ft². There are three test points:
GND, voltage regulator 1, and voltage regulator 2. The voltage regulators are mounted on
top of this board using the two top screw holes and the female four-pin headers at the
bottom of the board. The input connects to the board at the right-angle JST four-pin
header at the top right of the board. The two right-angle JST two-pin headers at the top of
the board are the outputs; they connect to the LiDAR sensor and the Raspberry Pi.
Raspberry Pi 4 Model B
Figure 4.1.6 Raspberry Pi 4 Model B
The Raspberry Pi 4 Model B (PI4-2GB-K102-1) acts as the central controller for the LiDAR
scanner system. It is responsible for coordinating the movement of the stepper motor (direction
and speed), gathering scan data from the MicroVision LiDAR sensor, and sending that data to a
server for further processing. To achieve these tasks, the Raspberry Pi 4 comes with four USB
ports and several onboard subsystems.

As shown in Figure 4.1.6, the board is powered by an Arm Cortex-A72 processor. This
processor has a 64-bit architecture and runs at up to 1.5 GHz. It is backed by 2 GB of RAM,
adequate to run all of the processes required for the working LiDAR scanner. Furthermore, the
board provides two USB 2.0 ports and two USB 3.0 ports, giving us four ports to connect our
LiDAR sensor and any additional USB hardware required for the operation of the system. The
LiDAR sensor should be used with the USB 3.0 ports for best performance. To store the data
collected by the sensor, the Raspberry Pi 4 Model B has a built-in SD card slot capable of
holding large amounts of data. Upon completing the scan, the stored data is sent to a remote
server for processing over the onboard 2.4 GHz / 5 GHz IEEE 802.11 b/g/n/ac Wi-Fi chip,
which transfers files at a maximum speed of 1300 Mb/s. To control and monitor external
circuits, such as the encoder and stepper motor, the board provides 40 GPIO pins. These pins
can be configured for pulse-width-modulation, digital, and interrupt signals.
Stepper Motor
Figure 4.1.7 Nema 14 Stepper Motor
To get a precise scan, a stepper motor (14HS10-0404S) is used to rotate the LiDAR sensor.
Stepper motors are generally used in applications where precise position control is desirable and
the cost or complexity of a feedback control system is unwarranted. The Nema 14 is a hybrid
bipolar stepper motor with a 1.8° step angle (200 steps/revolution). Each phase draws 500 mA at
12 V, allowing a holding torque of 1 kg-cm (14 oz-in). The motor has four color-coded wires
terminated with bare leads: black and green connect to one coil; red and blue connect to the
other. It can be controlled by a pair of suitable H-bridges (one for each coil).
Stepper Driver
Figure 4.1.8 DRV8825 Stepper Driver
To control the speed and direction of the stepper motor, a stepper motor driver is required.
The DRV8825 is a stepper driver capable of operating the stepper motor in our system, and
its voltage and current ratings match the motor's requirements. The chip can deliver up to
2.5 A of current, comfortably above the maximum required current of the stepper motor. In
addition, the DRV8825 can operate at full-step resolution or at 1/2, 1/4, 1/8, 1/16, and 1/32
microstep resolutions. Microstepping divides each full step into smaller steps to smooth out
the motor's rotation, especially at slow speeds. For example, a 1.8-degree step can be divided
up to 32 times, giving a step angle of 0.05625 degrees (1.8 ÷ 32), or 6400 microsteps per
revolution.
Figure 4.1.8.1 Step Resolutions
Our design allows us to change the step resolution on our interconnect PCB (the PCB shown
in figure 4.1.11) to get the best performance. The step resolution can be changed manually by
moving a jumper found on the interconnect PCB. Table 4.1.8.1 shows several possible step
resolutions.
Rotary Encoder
Figure 4.1.9 Rotary Encoder
To ensure the accuracy of the scan, a rotary encoder (S4T-200-236-S-B), shown in Figure 4.1.9, is
used. In the event of step loss from the stepper motor, the rotary encoder is used to compare the
actual position of the stepper motor with the position expected by the software. If the system
detects a disagreement between the two readings, it assumes an improper scan has been
executed and restarts the system for a new scan.
The encoder used is the SRT Miniature Optical Shaft Encoder. Optical encoders use light instead
of contacts to detect position, so they are inherently free from contact wear. For our system, a
200 CPR encoder is used. Cycles per revolution (CPR) is a measure of the resolution of the
encoder. One complete rotation of the encoder shaft is 360 mechanical degrees, and a 200 CPR
encoder can provide 200, 400, or 800 positions per revolution depending on whether x1, x2, or
x4 quadrature decoding is performed.
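The encoder arithmetic is straightforward: with 200 CPR and x4 quadrature decoding there are 800 counts per shaft revolution, i.e. 0.45° per count. The position check below is a hypothetical sketch of the verification idea described above (the function names and the 1° tolerance are our illustration, not values from the design):

```python
CPR = 200                      # cycles per revolution of the encoder
DECODE = 4                     # x4 quadrature decoding
COUNTS_PER_REV = CPR * DECODE  # 800 positions per revolution

def counts_to_degrees(counts: int) -> float:
    """Convert raw quadrature counts to shaft degrees."""
    return counts * 360.0 / COUNTS_PER_REV

def scan_is_valid(expected_deg: float, encoder_counts: int,
                  tolerance_deg: float = 1.0) -> bool:
    """Compare the commanded position against the encoder reading."""
    return abs(counts_to_degrees(encoder_counts) - expected_deg) <= tolerance_deg

# 133 counts is about 59.85 degrees, within 1 degree of a commanded 60° move.
ok = scan_is_valid(60.0, 133)  # True
```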
MicroVision LiDAR PSE-0400li3-101 Sensor
Figure 4.1.10 MicroVision LiDAR
The figure above shows MicroVision’s MEMS Based 3D LiDAR Engine, which is the sensor
utilized in our system. This sensor's data was used to construct 360-degree 3D layouts of rooms,
but there are many more future possibilities in AI and machine learning; constructing 360° 3D
scans is just the first step.
(Consumer LiDAR Product...)
Rotating Platform
The rotary platform is intended to hold all of the required hardware items to operate our indoor
LiDAR mapping system. MicroVision designed a mechanical platform for this project based on
our specific requirements. Figure 4.1.11 shows the front view and Figure 4.1.12 shows the side
view of the rotating platform. The diagram below is intended to show the mechanical
connections between components.
Figure 4.1.11 Front View of the Rotating Platform
The platform uses a stepper motor to rotate the top platform via a timing-belt pulley mounted
on an 8 mm shaft. The top, rotating platform holds the MicroVision LiDAR sensor, Raspberry Pi
4, and voltage step-down board. The bottom, stationary platform holds the battery module,
stepper motor, stepper motor driver, encoder, and timing belt. The two platforms are connected
electrically through a slip ring. The stepper motor driver is mounted onto the Interconnect
PCB, which was designed by MicroVision. This PCB also has a connector wired to the slip
rings, so any connection between the top and bottom platforms passes through this connector
and the slip rings. Since the platform is self-contained and designed to be untethered, it can
rotate continuously; there are no mechanical stops, since our system is closed-loop. The
platform uses spacers to implement a multi-floor design, making it versatile if more floors are
needed. Threaded rods running through the spacers, secured with nuts, hold the platform floors
together.
Figure 4.1.12 shows the GT2 timing belt, which synchronizes the motion-control elements: the
stepper motor, the rotary encoder, and the rotating top platform. With a 1:1.8 gear ratio, the
speed and direction of the rotating platform can be set. The stepper motor requires 200 steps to
make one full rotation, since each step is 1.8° (200 × 1.8° = 360°). The stepper motor is
connected to the rotating base through a 1:1.8 geared pulley, so the turntable requires
200 × 1.8 = 360 full steps to rotate one full turn, and 60 steps for a 60-degree motion. If the
1/32 microstep resolution is selected on the stepper driver, one full turn of the top rotating
platform requires 360 × 32 = 11,520 microsteps.
Figure 4.1.12 Side View of the Rotating Platform
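The step arithmetic above can be reproduced directly (all constants are the values stated in the text):

```python
MOTOR_STEPS_PER_REV = 200   # 1.8 degrees per full step
GEAR_RATIO = 1.8            # motor turns 1.8x per platform turn
MICROSTEPS = 32             # 1/32 microstep mode on the stepper driver

# Full motor steps for one full turn of the platform: 200 * 1.8 = 360.
full_steps_per_platform_rev = round(MOTOR_STEPS_PER_REV * GEAR_RATIO)

# In 1/32 microstep mode: 360 * 32 = 11,520 microsteps per platform turn.
microsteps_per_platform_rev = full_steps_per_platform_rev * MICROSTEPS

# A 60-degree platform motion takes 60 full steps (1 degree per step).
steps_for_60_degrees = full_steps_per_platform_rev * 60 // 360
```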
Figure 4.1.13 Constructed Platform from Design (First Iteration)
The image in Figure 4.1.13 shows a prototype of the platform without its top portion. This
prototype was built by MicroVision. It was still under development at the time; the final
version consisted of metal parts.
Figure 4.1.14 Final Constructed Platform from Design
The image in Figure 4.1.14 shows the final build of the complete system. The platform was
designed and built by MicroVision.
4.2 Layout diagrams for specific components and subsystems
Figure 4.2.1
Figure 4.2.1 outlines the complete wiring diagram for the LiDAR scanner system. The hardware
components are interconnected through the pins found on each device. The diagram includes
key components and devices such as the battery management system (BMS), voltage regulator,
stepper driver, rotary encoder, Raspberry Pi, and LiDAR scanner. In addition, passive
components such as inductors and capacitors are used. Since each component and device plays
an important role in the overall function of the system, the key components and devices are
described in the following section.
Battery Management System (BMS)
Figure 4.2.2 Battery Management System (BMS)
Figure 4.2.2 shows the input and output diagram for the BMS. The battery cells are connected in
series to provide power to the load, and each cell's voltage is an analog input to a difference
amplifier on the BMS board. The BMS monitors each battery's performance under discharge
regardless of its position in the series chain. The positive and negative terminals of the
batteries are connected to the B inputs: B+ is the positive input of the first battery, B1 is the
negative input of the first battery and the positive input of the second battery, and so on. P+ and
P− are the BMS output terminals. In addition, there is a push button linking B+ and P+, and a
switch connected to P+.
Power PCB
Figure 4.2.3 Power PCB Circuit Diagram
Figure 4.2.4 Power PCB Eagle Schematic
Figure 4.2.3 shows the diagram of the components within the Power PCB, and Figure 4.2.4 is
the Eagle schematic of the PCB. The Power PCB contains two voltage regulators, two Pi filters,
two fuses, and two Zener diodes. Each voltage regulator has five pins: Enable, Vin, VinGND,
Vout, and VoutGND. The regulators are connected to the output of the battery management
system (BMS) through pins Vin (2) and VinGND (3), and the stepped-down output is taken from
pins Vout (6) and VoutGND (4). The En pin gives the option to switch between an on state and a
low-power state; it can be left disconnected, as it is in our current system, if this feature is not
desired. Each voltage regulator is connected to a low-pass filter, a fast-blow resettable fuse, and
a 5.6 V Zener diode. The low-pass filter is a Pi-type filter used to reduce noise, in the form of
voltage fluctuations, from the voltage regulators. The peak-to-peak output voltage of the
regulators was originally 58.2 mV (with a switching frequency of 151.5 kHz). Our Pi filter was
designed with a cutoff frequency of 3.63 kHz; after filtering, the regulator output ripple dropped
to 2.77 mV (Vp-p). The fast-blow fuse provides overcurrent protection, and the Zener diode
provides overvoltage protection.
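The cutoff frequency of an LC low-pass stage follows the standard relation f_c = 1/(2π√(LC)). The component values below are purely illustrative, chosen to land near the 3.63 kHz figure reported above; they are not the actual values on our board:

```python
import math

def lc_cutoff_hz(L_henries: float, C_farads: float) -> float:
    """Cutoff frequency of an LC low-pass stage: f_c = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_henries * C_farads))

# Illustrative (hypothetical) values: 47 uH and 41 uF give roughly the
# 3.63 kHz cutoff reported for our Pi filter.
fc = lc_cutoff_hz(47e-6, 41e-6)
```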
Stepper Driver
Figure 4.2.5 DRV8825 Stepper Driver
Figure 4.2.5 shows the circuit diagram of the DRV8825 integrated motor driver. The DRV8825
has 16 pins. Pins 1 through 8 control properties of the stepper driver; this group includes the
enable, mode-select, sleep, reset, step, and direction pins. The step, direction, and enable pins
are connected to GPIO pins on the Raspberry Pi 4. To control the step resolution, the
mode-select pins (2, 3, and 4) are also connected to GPIO pins.

The outputs of the DRV8825 are on pins 9-16: the motor power supply connections and pins
A1, A2, B1, and B2, which connect to the stepper motor's four stator cables. The DRV8825 can
be powered with a supply voltage between 8.2 and 45 V and can provide an output current of up
to 2.5 A full-scale. To reduce voltage ripple, a 100 µF capacitor must be placed between pins 15
and 16. To reduce power consumption in the idle state, the driver's low-power sleep mode is
used by toggling the enable pin.
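Driving the DRV8825 from the Pi amounts to toggling the STEP pin at a fixed rate. The helper below computes the pulse timing; the actual GPIO calls (e.g. via the RPi.GPIO library) are shown only as comments, since they require the hardware, and the 10 RPM figure is an illustrative value, not our configured speed:

```python
def step_delay_s(rpm: float, steps_per_rev: int = 200, microsteps: int = 32) -> float:
    """Half-period between STEP-pin toggles for a target motor speed.

    Each microstep needs one full pulse (high then low), so the pin is
    toggled at twice the pulse rate.
    """
    pulses_per_rev = steps_per_rev * microsteps
    pulses_per_sec = rpm / 60.0 * pulses_per_rev
    return 1.0 / (2.0 * pulses_per_sec)

# On the Pi, each pulse would look roughly like:
#   GPIO.output(STEP_PIN, GPIO.HIGH); time.sleep(delay)
#   GPIO.output(STEP_PIN, GPIO.LOW);  time.sleep(delay)
delay = step_delay_s(rpm=10)  # 10 RPM in 1/32-microstep mode
```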
Lidar Scanner and Rotary Encoder
Figure 4.2.6 MicroVision Lidar Scanner
Figure 4.2.6 shows the connections to the MicroVision PSE-0400li3-101. The LiDAR has three
input/output connections used in our system. Pins Vin and GND are the power input for the
LiDAR unit; the sensor requires an input of 5 V at 2 A, supplied from the voltage regulator. To
receive the scan output from the LiDAR unit, its USB port is connected to a USB port on the
Raspberry Pi 4 Model B. The protocol used to transfer data is USB 3.0, which provides
sufficient bandwidth between the two devices.

To ensure scans are done correctly, the rotary encoder is also connected to the GPIO pins of
the Raspberry Pi 4 Model B. The rotary encoder is linked mechanically to the stepper motor
and provides constant feedback to the Raspberry Pi to verify that the required steps are being
executed properly. Pins A+, A−, B+, and B− are connected to the GPIO pins, and the encoder's
5V and GND pins are connected to the Raspberry Pi's onboard power.
4.3 Explanation of how software interfaces with hardware
Our LiDAR scanner system requires an interface between hardware and software to work
properly. Core components such as the stepper driver chip (DRV8825) and the MicroVision
LiDAR sensor are all monitored and instructed by the onboard software. To achieve this, the
GNU/Linux operating system is used.
Figure 4.3.1 Layer diagram of software and hardware interfaces in GNU/Linux operating system
Figure 4.3.1 outlines the system architecture of the GNU/Linux operating system. The operating
system breaks the execution of programs into two major categories, kernel space and user space.
The kernel space is reserved for the most important processes. When a user runs an application
or tool, it executes in what is called user space. Since applications can come from a variety of
sources, some may be poorly developed or of unknown origin. Running these applications
separately from kernel space prevents programs from tampering with kernel resources and
causing the system to panic or crash.
The hardware consists of all peripheral devices (RAM, HDD, CPU, I/O) and is accessed through
the kernel. To get the most out of the system, core parts of the LiDAR system are controlled by
programs running in kernel space, which ensures that the required process is executed on
demand by the CPU. Other processes, such as post-processing, that do not require direct access
to the hardware are executed in user space. The different privilege levels for each process
guarantee that important processes take precedence over less important ones.
4.4 How hardware was fabricated and assembled
Each component was purchased and then integrated into our Power PCB. All chips were
surface-mounted onto the PCB, which is designed to provide the intermediary connections
between them. Two low-pass Pi filters were built and soldered directly onto the PCB at the
outputs of the voltage regulators. The battery holder shown in Figure 4.1.2 was placed on the
platform to hold the batteries in place, and input and output connectors were soldered onto the
PCB for external connections.
5. Description of Project - Tier 2 Software
5.1 Identification of software that will be used
● Frontend Website Technologies
○ HTML/CSS/JS
■ Vue.js - A JavaScript framework that helps modularize web components.
■ Typescript - A strictly typed programming language that compiles to
JavaScript.
■ Webpack - A JavaScript module bundler.
■ SCSS/SASS - A stylesheet language that compiles to CSS.
○ Node.js
■ Express - A web application framework.
■ Socket.io - A library that abstracts away the socket interfaces and
quickly enables TCP socket events.
● Amazon Web Services (AWS)
○ Simple Storage Service (S3) - An object storage service.
○ Amazon Elastic Compute Cloud (EC2) - A service that provides cloud servers.
● Point Cloud Processing and sensor interfaces
○ Python
■ ROS
■ RPi.GPIO
■ RpiMotorLib
■ OpenCV
5.2 System requirements to run software
Raspberry Pi 4 B
A Linux distribution such as Ubuntu or Raspbian was installed on the Raspberry Pi to run the
onboard code, along with a Python 2 interpreter to run the program. Python 2 is a cross-platform
language that can be interpreted on different boards, so if another microcomputer were required
to replace the Pi, porting the code could be done easily; only the GPIO library would need to be
modified, because the board uses I/O pins to communicate with the motor driver and rotary
encoder. To optimize efficiency and speed, unneeded processes running on the Pi are disabled,
and our program is set to start automatically on boot.
IDE
Various IDEs (Integrated Development Environments) were used to facilitate writing the
Python code: Visual Studio Code, Atom, and PyCharm. IDEs are preferred over the terminal
because they integrate writing code, testing, version control, formatting, and more.
5.3 Software Systems Overview
Our system consists of five major components: the Processing Server, the Application Server,
the PCD Database (S3), the Web Viewer (Web App), and the Microcomputer (see Fig. 5.3.1).
The Web Viewer initiates a scan by emitting an event. The Microcomputer receives the event,
begins scanning the room, and sends the resulting scans to the Processing Server. The Processing
Server stitches together the LiDAR scans and stores the final model in the S3 storage. Finally, a
user uses the Web Viewer to view the models.
Figure 5.3.1 Software System Architecture
5.4 Details of Software Systems
Application Server
Overview
The Application Server serves multiple purposes. It
facilitates communication between the WebApp and the
Microcomputer, serves the WebApp to a client browser,
provides access to the saved PCD files, and provides
metadata about what each PCD file represents.
Event Handler
The Event Handler facilitates dynamic communication
between the WebApp and the Microcomputer and
retains information about what transpired. This enables
settings for the Scanner to be adjusted by the WebApp.
Retaining the metadata of the scan is also important
because the PCD database retains only the name of a
scan. Otherwise, information regarding the scan's
resolution, rotation count, and elapsed time would be
lost. That data is valuable for evaluating the
performance of the system and should therefore be
stored. Documentation for each event is given in the
next section.
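The bookkeeping described above can be sketched as follows. The actual Event Handler runs on the Node.js Application Server over Socket.io; this Python sketch only illustrates the metadata retention, and the event and field names are assumptions:

```python
class EventHandler:
    """Relay scanner events and retain per-scan metadata that the PCD
    database (which stores only scan names) would otherwise lose."""

    def __init__(self):
        self.metadata = {}  # scan name -> settings and results

    def on_start_scan(self, name, resolution, rotations):
        # Record the settings requested by the WebApp, then forward
        # them to the Microcomputer over the socket connection.
        self.metadata[name] = {"resolution": resolution,
                               "rotations": rotations}

    def on_scan_complete(self, name, elapsed_s):
        # Record how long the scan took for later performance evaluation.
        self.metadata[name]["elapsed_s"] = elapsed_s

handler = EventHandler()
handler.on_start_scan("room1", resolution="high", rotations=2)
handler.on_scan_complete("room1", elapsed_s=73.5)
```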
API Endpoints
API (Application Programming Interface) Endpoints are exposed to the WebApp for passing
static data, such as PCD files, HTML and other static files, and metadata of scan.
Figure 5.4.1 Application Server Architecture
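A client can retrieve these endpoints with plain HTTP. The sketch below builds the request URLs using the Python standard library; the server address is a hypothetical assumption:

```python
from urllib.request import urlopen

BASE = "http://localhost:3000"  # hypothetical Application Server address

def scan_url(base, scanid):
    """Build the URL for the /scan/:scanid route."""
    return "%s/scan/%s" % (base, scanid)

def get_scan(scanid):
    """GET /scan/:scanid and return the raw PCD file bytes."""
    with urlopen(scan_url(BASE, scanid)) as resp:
        return resp.read()

def get_scan_list():
    """GET /scans and return the JSON array of available scans."""
    with urlopen(BASE + "/scans") as resp:
        return resp.read()
```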
Processing Server
Overview
The Processing Server is responsible for taking the BAG file
generated by the Microcomputer and outputting a stitched model of
the room in PCD format. The Microcomputer passes the BAG file to
an API endpoint, which hands it to the initializer process that begins
the stitching. When stitching is complete, the server passes the
PCD to the PCD Database.
Processing Script
This script starts up a stitching process node and then transfers the
data in from the S3 database. Once those processes have completed,
the final PCD file is added back into the S3 database.
API Endpoints
API (Application Programming Interface) Endpoints are exposed to the Microcomputer for
passing the BAG file that contains the scan information.
Figure 5.4.2 Processing Server Architecture
PCD Database
Overview
The PCD Database uses Amazon Simple Storage Service (S3) for file storage. It stores final
stitched PCD files and the Bag files (a file format type that stores message data) that are
processed by the server. The Processing server uploads the BAG and PCD Files, and the
Application server downloads the PCD files. The structure of the data is as follows:
Figure 5.4.3 PCD Database
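With boto3 (the AWS SDK for Python), the upload and download paths can be sketched as below. The bucket name and key layout are illustrative assumptions, and the library calls are wrapped in functions so the helpers run anywhere:

```python
try:
    import boto3  # AWS SDK for Python; needs configured credentials
except ImportError:
    boto3 = None  # allows the key helpers below to run anywhere

BUCKET = "lidar-scans"  # hypothetical bucket name

def pcd_key(scan_name):
    """Key under which a stitched model is stored."""
    return "pcd/%s.pcd" % scan_name

def bag_key(scan_name):
    """Key under which the raw recording is stored."""
    return "bag/%s.bag" % scan_name

def upload_model(local_path, scan_name):
    """Processing server: store a stitched PCD in S3."""
    boto3.client("s3").upload_file(local_path, BUCKET, pcd_key(scan_name))

def download_model(scan_name, local_path):
    """Application server: fetch a PCD to serve to the web viewer."""
    boto3.client("s3").download_file(BUCKET, pcd_key(scan_name), local_path)
```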
Rotary Encoder
Figure 5.4.4 Rotary Encoder
The rotary encoder flowchart in Figure 5.4.4 shows how information is gathered from the rotary
encoder and made available for use. The encoder interfaces with the Raspberry Pi through a
GPIO library called RPi.GPIO. The rotary encoder code acts as a publisher, publishing the
current angular position of the encoder. After initialization, the software waits until it receives a
pulse on an I/O pin. On each pulse, a subroutine increments the pulse counter and converts the
total number of pulses to degrees. A global angular-position variable is updated and published.
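The counting subroutine reduces to simple arithmetic. In the sketch below the pulse handler would be wired to RPi.GPIO edge callbacks on the A/B pins; here it is called directly. The 200 counts/turn figure comes from the encoder in the bill of materials, while decoding all four quadrature edges is an assumption:

```python
class QuadratureEncoder:
    """Count encoder pulses and expose the platform angle in degrees."""

    def __init__(self, counts_per_rev=200, quadrature=4):
        self.edges_per_rev = counts_per_rev * quadrature  # 800 edges/rev
        self.count = 0

    def on_pulse(self, direction=1):
        # In the real system this is an RPi.GPIO event callback on the
        # A/B pins; direction is decoded from the A/B phase relationship.
        self.count += direction

    @property
    def degrees(self):
        """Current angular position, wrapped to [0, 360)."""
        return (self.count % self.edges_per_rev) * 360.0 / self.edges_per_rev

enc = QuadratureEncoder()
for _ in range(200):  # a quarter turn: 200 of 800 edges
    enc.on_pulse()
```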
LiDAR Sensor
Figure 5.4.5 LiDAR Sensor
The LiDAR sensor flowchart in Figure 5.4.5 shows how LiDAR depth information is gathered
and output. The LiDAR sensor code acts as a publisher, publishing depth frames from the
LiDAR. After initialization, OpenCV (Open Computer Vision) is used to open a connection to
the sensor. The software then waits until a capture request is received, at which point the LiDAR
depth data is saved. The depth frame is then processed, converted to Cartesian coordinates, and
published.
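Converting a depth sample to Cartesian coordinates is standard spherical-to-Cartesian math. This sketch handles a single sample; the real code applies the equivalent per-pixel conversion to whole frames grabbed through OpenCV, and the beam-angle convention here is an assumption:

```python
import math

def depth_to_cartesian(depth_m, azimuth_deg, elevation_deg):
    """Convert one LiDAR range sample plus its beam angles to (x, y, z)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = depth_m * math.cos(el) * math.cos(az)  # forward
    y = depth_m * math.cos(el) * math.sin(az)  # left
    z = depth_m * math.sin(el)                 # up
    return x, y, z

# A point 2 m straight ahead maps to (2, 0, 0).
point = depth_to_cartesian(2.0, 0.0, 0.0)
```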
Motor Driver
Figure 5.4.6 Motor Driver
The motor driver flowchart in Figure 5.4.6 shows how the motor driver produces platform
rotation. The motor driver code interfaces between the Raspberry Pi and the driver board
through a library called RpiMotorLib. After initialization, the software establishes a connection
to the driver board through the I/O pins on the Raspberry Pi. The software then waits until a
request to move the motor is received, at which point the Raspberry Pi communicates to the
driver board the number of steps, acceleration, speed, and direction in which to move.
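The step count for a requested rotation is simple arithmetic, and RpiMotorLib's `A4988Nema` class (which also supports the DRV8825) issues the pulses. The pin numbers, 1/32 microstepping, and timing values below are illustrative assumptions, and the library call is guarded so the arithmetic still runs off-target:

```python
def steps_for_rotation(degrees, step_angle=1.8, microsteps=32):
    """Driver pulses needed for a platform rotation (1.8° full steps)."""
    return round(degrees / step_angle * microsteps)

try:
    from RpiMotorLib import RpiMotorLib  # only available on the Pi

    # Hypothetical BCM pins for DIR, STEP and the M0-M2 mode inputs.
    motor = RpiMotorLib.A4988Nema(20, 21, (14, 15, 18), "DRV8825")
    motor.motor_go(clockwise=True, steptype="1/32",
                   steps=steps_for_rotation(90),  # quarter turn
                   stepdelay=0.005, verbose=False, initdelay=0.05)
except ImportError:
    pass  # off-target: only the step arithmetic above runs
```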
ROS Bag
Figure 5.4.7 ROS Bag
The ROS Bag flowchart in Figure 5.4.7 shows the process of BAG file creation. ROS (Robot
Operating System) is the framework our project utilized to compile all of our sensor data
together. The BAG file acts as a subscriber to the sensor nodes, which publish data to it. The
chart consists of two main publisher nodes: the rotary encoder and the LiDAR. These two
nodes publish their data, which is then compiled into a ROS bag and sent to the S3 database.
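The compilation step amounts to associating each depth frame with the encoder angle recorded closest to it in time before both are written to the bag (in ROS, via `rosbag.Bag(path, 'w').write(topic, msg)`). A minimal sketch of that pairing, with illustrative sample data:

```python
def pair_by_time(encoder_samples, lidar_frames):
    """Pair each LiDAR frame with the encoder angle nearest in time.

    encoder_samples: list of (timestamp, angle_deg)
    lidar_frames:    list of (timestamp, frame)
    Returns a list of (angle_deg, frame) ready to be bagged together.
    """
    paired = []
    for t_frame, frame in lidar_frames:
        _, angle = min(encoder_samples, key=lambda s: abs(s[0] - t_frame))
        paired.append((angle, frame))
    return paired

# A frame captured at t=0.9 s picks up the encoder sample from t=1.0 s.
pairs = pair_by_time([(0.0, 0.0), (1.0, 45.0)], [(0.9, "frame1")])
```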
Processing Server
Figure 5.4.8 Processing Server
The processing server flowchart shows the end-to-end process from a raw BAG file to a
complete PCD file. After receiving a BAG file, the stitching algorithm is initialized with the
metadata of the scan. The algorithm then converts the BAG file into a point-cloud data set in the
form of a PCD file, which is exported to the S3 database.
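A much-simplified stand-in for the stitching step is sketched below: each frame's points are rotated into a common room frame by the platform angle at capture time, and the merged cloud is written as an ASCII PCD file. This illustrates the data flow, not the actual algorithm:

```python
import math

def rotate_about_z(points, angle_deg):
    """Rotate (x, y, z) points by the platform angle at capture time."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    return [(c * x - s * y, s * x + c * y, z) for x, y, z in points]

def write_pcd(path, points):
    """Write points as a minimal ASCII PCD v0.7 file."""
    header = ("# .PCD v0.7 - Point Cloud Data file format\n"
              "VERSION .7\nFIELDS x y z\nSIZE 4 4 4\nTYPE F F F\n"
              "COUNT 1 1 1\nWIDTH {n}\nHEIGHT 1\n"
              "VIEWPOINT 0 0 0 1 0 0 0\nPOINTS {n}\nDATA ascii\n"
              ).format(n=len(points))
    with open(path, "w") as f:
        f.write(header)
        for x, y, z in points:
            f.write("%f %f %f\n" % (x, y, z))

# A point captured at platform angle 90° is rotated a quarter turn.
cloud = rotate_about_z([(1.0, 0.0, 0.0)], 90.0)
```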
Web App
Overview
The Web App is the main interface between the user and the hardware. A user is able to send the
hardware commands, see what state the scanning process is in, and view all previously generated
scans in the web viewer. The important screens are detailed here.
Home
This page gives an overview of the
project. A user can learn more about
the project, begin the scanning
process, or view one of the most
recent scans.
LiDAR Scanning
This page allows the user to send
commands to the hardware. They
can adjust the resolution, name the
file, and set the number of rotations.
The state and progress of the scan
are also displayed.
Point cloud viewer
This page displays a generated point
cloud to the user. They can rotate,
scale, and move the scan using the
keyboard and mouse. Metadata is
also displayed.
Figure 5.4.9 Web App
6. Integration and Testing
6.1 Procedure for testing and debugging
Hardware Testing and debugging
Testing: BMS / Li-Ion charger
Tests were run on the BMS and the Li-ion charger. First, the batteries were connected to the
BMS, and the output of the BMS was connected to a dummy load (resistors). An oscilloscope
and a multimeter were used to measure the voltage across the load to confirm that the BMS was
working as intended. Because the batteries were discharged during that test, the charger was
tested next, simply by plugging the batteries into the charger and checking that each terminal of
the charger works and successfully charges the batteries.
Testing: LiDAR sensor, voltage regulation and current
The LiDAR sensor was first verified to be operational: test scans were taken prior to attaching it
to the system, and once battery power was attached, the sensor was checked for functionality
again. Next, the voltage regulator was tested by connecting it to a resistor dummy load and
measuring the voltage across it and the current through it. Figure 6.1.1 shows the results of the
voltage regulator test. The measurements were taken with an oscilloscope and/or multimeter.
Figure 6.1.1 Voltage Regulator Performance Test
Testing: Switching frequency of the voltage regulator
The 5 V voltage regulator that was used is of the switching type, so it induces additional noise
that needed to be minimized. This noise can cause problems in the operation of the LiDAR
sensor and the Raspberry Pi because the input voltage may fluctuate beyond what the two
devices tolerate. Therefore, a low-pass filter was added to the output of the voltage regulator.
Prior to that, the two devices were tested without a filter to see if they still worked.
When designing the filter, the switching frequency of the regulator needed to be determined.
This was accomplished by connecting a load resistor to the regulator's output and analyzing the
resulting waveform on an oscilloscope. The filter was then designed to have a suitable cutoff
frequency. Figure 6.1.2 shows the results of the low-pass filtering.
Figure 6.1.2 Voltage Regulator Filtering Results
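For an LC low-pass section of a Pi filter, the cutoff can be estimated as f_c = 1/(2π√(LC)). The component values below are illustrative assumptions, not the values actually used:

```python
import math

L = 10e-6    # 10 µH series inductor (hypothetical value)
C = 100e-6   # 100 µF shunt capacitance (hypothetical value)

# Cutoff frequency of the LC low-pass section.
f_cutoff = 1.0 / (2.0 * math.pi * math.sqrt(L * C))

# ~5 kHz: far below a typical switching frequency in the hundreds of kHz,
# so the switching ripple is strongly attenuated.
print("cutoff = %.0f Hz" % f_cutoff)
```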
Testing: Stepper motor test & stepper driver Test
The stepper motor contains four stator wires belonging to two separate groups: group A and
group B. A0 and A1 are linked internally, and B0 and B1 are linked internally as well. To check
the working condition of the stepper motor, a continuity measurement of each group was made,
giving an indication of the motor's health. To check the working condition of the stepper driver,
a multimeter was used to measure the output current and voltage of the device. If the measured
output did not correspond to the desired output, adjustments were made with the onboard
potentiometer.
Software Testing and debugging
Tools
● Postman - an API testing software that enables sending HTTP requests with specified
data. These requests can be chained through scripting in order to test the functionality of
the server. These scripts are written with JavaScript.
● MeshLab - a 3D processing tool that can be used to visualize PCD files.
Overview
The software testing consisted of multiple parts. Automated tests ensured that the services were
working as expected and manual tests were performed to validate the output.
Automated tests
Postman tests were written to ensure that each route performed as expected. During
development, all tests were run automatically every time the software was updated, ensuring
that updates did not break the current build. Testing each server's routes, detailed in the
documentation in previous sections, involved sending a request and validating the response.
Each route was tested on its own and then in relation to a use case. The Application Server
events were tested in a similar manner. The automated testing plan is detailed here:
Routes
● API Testing
○ Application Server
■ “/”
Data in request sent: N/A
Response expected: HTML in any form
■ “/scans”
Data in request sent: N/A
Response expected: A JSON array of data that details what scans are available.
■ “/scan/:scanid”
Data in request sent: scanid for a given scan.
Response expected: A PCD file
■ “/snapshots”
Data in request sent: N/A
Response expected: A JSON array of data that details what snapshots are available.
■ “/snapshot/:snapshotsid”
Data in request sent: snapshotsid for a given scan.
Response expected: A PCD file
○ Processing Server
■ “/scan”
Data in request sent: BAG file
Response expected: Status 200
Extra validation: Check that the PCD Database contains the expected BAG file
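Each Postman check boils down to asserting on the status code and response shape. The sketch below mirrors the GET /scans check in Python; the `name` field on each entry is an assumption about the response schema:

```python
def validate_scans_response(status, body):
    """Mirror of the automated check for GET /scans: HTTP 200 plus a
    JSON array whose entries describe the available scans."""
    assert status == 200, "unexpected status code"
    assert isinstance(body, list), "expected a JSON array"
    for item in body:
        assert "name" in item, "each scan entry should be named"
    return True

ok = validate_scans_response(200, [{"name": "discovery-hall"}])
```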
● Integration Testing
Manual Tests
● Motor driver step commands
Tests were conducted to make sure that the number of steps sent by the Raspberry Pi to
the stepper motor resulted in the correct amount of rotation. This was tested by polling
the rotary encoder for the current angular position of the motor shaft and correlating that
value to the command that was sent to the motor driver. Another approach was to use a
protractor to mark the expected degrees of movement; a marker on the moving
component was then compared to the expected final position to verify that the correct
movement was made.
● Timing of events
Tests were completed to ensure that all movements and data captures occurred exactly
when expected. The LiDAR was expected to capture a frame every X degrees of rotation.
To test this, the angular position of the shaft was saved every time the LiDAR captured a
frame of data. This data was then reviewed to ensure that a frame was taken every X
degrees.
● Point cloud result
We ensured that the point-cloud output of the stitching algorithm represented the actual
room that was scanned. This was verified by comparing the actual room to the 3D
point-cloud output in the viewer: different objects and features of the room in the point
cloud were observed and compared to the actual room to confirm that they appeared at
approximately the same locations and distances.
6.2 Final results
Our final complete system consists of a LiDAR sensor mounted on a portable rotating platform
that creates a 360-degree point-cloud map of a room. A user is able to command the system to
begin the scanning process, send the data captured by the sensor to a server for processing, and
then view the resulting 3D map on a web viewer. The entire system is portably powered with
rechargeable Li-ion batteries.
Figure 6.2.1 RGB photo of Discovery Hall
Figure 6.2.2 3D scan of Discovery Hall
Figure 6.2.3 3D scan including plan view, and interior view
7. Bill of Materials (BOM)
Purpose | Description | Manufacturer | Distributor | Product ID | Unit price | Quantity | Cost
Product | Raspberry Pi 3 B+ Kit | Raspberry Pi | Amazon | B07BCC8PK7 | $79.99 | 2 | $159.98
Product | Raspberry Pi 4 B Basic Kit | Raspberry Pi | CanaKit | PI4-2GB-K102-1 | $59.95 | 2 | $119.90
Product | Miniature Slip Ring | Prosper | Adafruit | 1195 | $24.95 | 1 | $24.95
Product | Nema 14 Bipolar Motor | StepperOnline | StepperOnline | 14HS10-0404S | $9.52 | 1 | $9.52
Product | Stepper Motor Driver | Texas Instruments | Pololu | DRV8825 | $8.95 | 1 | $8.95
Product | 72T, GT2-2mm pulley, 8mm bore | SDP/SI | SDP/SI | A6Z51M072DF0608 | $9.04 | 1 | $9.04
Product | 40T, GT2-2mm pulley, 5mm bore | SDP/SI | SDP/SI | A6Z51M040DF0605 | $7.45 | 1 | $7.45
Product | 40T, GT2-2mm pulley, 6mm bore | SDP/SI | SDP/SI | A6A51M040DF0606 | $15.00 | 1 | $15.00
Product | 145 tooth belt, GT2, 2mm pitch, 6mm wide | SDP/SI | SDP/SI | A6R51M145060 | $5.61 | 1 | $5.61
Product | Encoder, 200 counts/turn, 6mm shaft | US Digital | US Digital | S4T-200-236-S-B | $71.80 | 1 | $71.80
Product | Encoder harness; #26 gauge, 4x1.25mm pitch | US Digital | US Digital | CA-MIC4-W4-NC-1 | $6.80 | 1 | $6.80
Product | Samsung 18650 3500 mAh Li-Ion Battery | Samsung | imrbatteries | 1NR18650-35E | $4.99 | 3 | $14.97
Product | 5V Step-Down Voltage Regulator | Pololu | Pololu | D24V50F5 (item #2851) | $14.95 | 2 | $14.95
Product | Nitecore i4 (4 Bay Battery Charger) | Nitecore | 18650Battery | B00GODG3X0 | $16.99 | 1 | $16.99
Product | Lithium Ion BMS board | Vetco | Vetco | VUPN712 | $10.95 | 1 | $10.95
Product | Metal platform parts | Siverson Design | Siverson Design | N/A | $2621.65 | 1 | $2621.65
Total parts: 21 | Total cost: $3,133.46
9. List of Industry Standards
IEEE 802 standards
The IEEE 802 is a family of IEEE standards that is restricted to networks carrying variable-sized
packets (Cell network data is transmitted in uniform units). The working groups of IEEE 802 are
802.2 (Logical Link Control), 802.3 (Ethernet), 802.11 (Wi-Fi), and 802.15 (Wireless Personal
Area Network) [includes 802.15.1 (BT) and 802.15.4 (Low-Rate Wireless PAN)]. The services
and protocols map to the data link and physical layers. The radio frequencies it covers range
from 2.4GHz, 5Ghz and up to 60 GHz.
IEEE 1394
IEEE 1394, the High Performance Serial Bus, is an electronics standard for connecting devices
to a personal computer. IEEE 1394 provides a single plug-and-socket connection to which up to
63 devices can be attached, with data transfer speeds up to 400 Mbps (megabits per second). It
is better known commercially as FireWire.
Inter-integrated Circuit (I2C) Protocol
A protocol intended to allow multiple “slave” digital integrated circuits (“chips”) to
communicate with one or more “master” chips. It is intended only for short-distance
communication within a single device, and requires only two signal wires to exchange
information. It is being used for interfacing the MCU with the accelerometer in this project.
Laser Safety Standard
The MicroVision LiDAR our team is using for this Capstone project is classified as a Class 1
Laser Product as defined in IEC60825-1. A class 1 laser means that the laser is eye-safe under all
operating conditions. The beam produces less than 0.39 milliwatts.
Python 2
A programming language is a set of rules and keywords that enable a person to communicate
with and control a computer. By following Python 2’s syntax rules, keywords, and structures,
our team was able to create the software, which the Python interpreter reads and executes to run
our system.
10. Acknowledgements
We would like to thank Roger Johnson, our industry advisor from MicroVision. Roger has been
incredibly helpful to our team for designing the mechanical platform that housed our electronics
and software. Roger has also guided and helped our team throughout phase 1 and 2 of our
Capstone project.
We would also like to thank Dr. Wayne Kimura, our faculty advisor. Dr. Kimura has helped our
team throughout phase 1 and 2 of Capstone by ensuring that our group was always on top of
everything, and by providing guidance and advice.
We would like to thank Selvan Viswanathan, Phil Kent, and Henry Baron from MicroVision for
supporting and advising our team through phase 1 and 2 of our Capstone Project.
Thank you to MicroVision for sponsoring our Capstone project.
11. Bibliography
“Algorithm Walkthrough for Tuning” Cartographer ROS Documentation, 5 Nov. 2018, google-
cartographer-ros.readthedocs.io/en/latest/algo_walkthrough.html.
Consumer LiDAR Product Family: MEMS Based 3D LiDAR Engine. MicroVision, 2019.
Product brief.
Gregersen, Erik. “Lidar.” Encyclopædia Britannica, Encyclopædia Britannica, Inc., 13 Oct.
2016, www.britannica.com/technology/lidar.
Neff, Todd. The Laser That's Changing the World: The Amazing Stories behind Lidar, from 3D
Mapping to Self-Driving Cars. Prometheus Books, 2018.
12. Appendices
1. Google Cartographer - high level system overview
(“Algorithm Walkthrough for Tuning”)
This diagram shows a high-level system view of Cartographer. Google open-sourced this
algorithm in 2016 and still uses it in Google Street View to map building interiors. Cartographer
provides real-time simultaneous localization and mapping (SLAM) in 2D and 3D across multiple
platforms and sensor configurations.
For our indoor 3D mapping system, we utilized two types of input. The first is the laser scan
provided by the MicroVision LiDAR sensor. The second is the rotary encoder, which provides
the angular position. The laser scan data first passes through a voxel filter, which down-samples
the dataset. The filtered scan data and the encoder’s angular position are then fed to a pose
extrapolator, which helps the algorithm predict where the next scan should be inserted into the
submap. The submap is created by passing the sensor data through scan matching and a motion
filter. These submaps are then combined into a global map, creating a 3D point cloud of the
environment.
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxOnBoard
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Mattias Andersson
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Enterprise Knowledge
 
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024BookNet Canada
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationSafe Software
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024Scott Keck-Warren
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesSinan KOZAK
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...shyamraj55
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhisoniya singh
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Patryk Bandurski
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Scott Keck-Warren
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr LapshynFwdays
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsMemoori
 
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks..."LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...Fwdays
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxMalak Abu Hammad
 

Recently uploaded (20)

My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 Presentation
 
Unlocking the Potential of the Cloud for IBM Power Systems
Unlocking the Potential of the Cloud for IBM Power SystemsUnlocking the Potential of the Cloud for IBM Power Systems
Unlocking the Potential of the Cloud for IBM Power Systems
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck Presentation
 
Benefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksBenefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other Frameworks
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptx
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024
 
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen Frames
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial Buildings
 
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks..."LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptx
 

3D Mapping with LiDAR - Report

  • 2. 2
1. Acronym Definitions 5
2. Introduction 6
2.1 Background and Motivation 6
2.2 Overview of proposed project 6
2.3 Deliverables 7
3. Description of Project – Tier 1 Level 8
3.1 Top-level diagrams and system flowcharts 8
Tier 1 Hardware System Overview 8
Li-ion Battery 8
Battery Management System (BMS) 9
Voltage Regulator 9
Microcontroller 9
Motor 9
MicroVision Lidar PSE-0400li3-101 Sensor 9
Tier 1 Software System Overview 10
Microcomputer 11
Processing Server 11
PCD Database on S3 11
Application Server 11
Web Viewer / Web App 11
3.2 Identification of problems and issues 12
4. Description of Project - Tier 2 Hardware 13
4.1 Identification and specifications of major components 13
3D Printed Battery Case 14
Battery Management System (BMS) 14
Voltage Regulator 15
Power PCB 16
Raspberry Pi 4 Model B 17
Stepper Motor 18
Stepper Driver 18
Rotary Encoder 19
MicroVision LiDAR PSE-0400li3-101 Sensor 20
Rotating Platform 21
4.2 Layout diagrams for specific components and subsystems 25
  • 3. 3
Battery Management System (BMS) 26
Stepper Driver 28
Lidar Scanner and Rotary Encoder 29
4.3 Explanation of how software interfaces with hardware 30
4.4 How hardware was fabricated and assembled 31
5. Description of Project - Tier 2 Software 32
5.1 Identification of software that will be used 32
5.2 System requirements to run software 33
Raspberry Pi 4 B 33
IDE 33
5.3 Software Systems Overview 34
5.4 Details of Software Systems 35
Application Server 35
Processing Server 36
Overview 36
Processing Script 36
API Endpoints 36
PCD Database 37
Overview 37
Rotary Encoder 38
LiDAR Sensor 38
Motor Driver 39
ROS Bag 39
Processing Server 40
Web App 41
6. Integration and Testing 42
6.1 Procedure for testing and debugging 42
Hardware Testing and debugging 42
Testing: BMS / Li-Ion charger 42
Testing: LiDAR sensor, voltage regulation and current 42
Testing: Switching frequency of the voltage regulator 42
Testing: Stepper motor test & stepper driver Test 43
Software Testing and debugging 44
  • 4. 4
Tools 44
Overview 44
Automated tests 44
Routes 44
Manual Tests 45
6.2 Final results 46
7. Bill of Materials (BOM) 48
9. List of Industry Standards 49
IEEE 802 standards 49
IEEE 1394 49
Inter-integrated Circuit (I2C) Protocol 49
Laser Safety Standard 49
Python 2 49
10. Acknowledgements 50
11. Bibliography 51
12. Appendices 52
1. Google Cartographer - high level system overview 52
  • 5. 5 1. Acronym Definitions

● API - Application Programming Interface
● AWS - Amazon Web Services
● BMS - Battery Management System
● CAD - Computer Aided Design
● EC2 - Elastic Compute Cloud
● IDE - Integrated Development Environment
● Li-ion - Lithium-ion
● LiDAR - Light Detection and Ranging
● MVIS - MicroVision
● OpenCV - Open Computer Vision
● PCD - Point Cloud Data
● ROS - Robot Operating System
● S3 - Simple Storage Service
● SLAM - Simultaneous Localization and Mapping
● TOF - Time of Flight
  • 6. 6 2. Introduction

2.1 Background and Motivation

LiDAR (Light Detection and Ranging) is a sensor that uses light to measure range. The system pulses a laser at a scene and measures the distance to an object using the round-trip travel time and the speed of light. Although LiDAR was first used after the invention of the laser in 1960, the underlying concept has existed since 1938, when cloud heights were measured using pulses of light (Gregersen). Edward Hutchinson Synge, known as Hutchie, first conceived the concept of LiDAR by having an observer aim searchlights at the night sky while a reflector bounced the light into a photodetector (Neff). By using a fixed angle, the altitude could be determined (Neff). LiDAR was first implemented on airplanes and used to map the ground (Gregersen). This is done by rapidly firing LiDAR pulses at the ground and collecting the pulses that return upon reflection. GPS and an IMU (inertial measurement unit) are also used to determine the position and orientation of the airplane (Gregersen).

Our project utilizes stitching to create a 360-degree depth view of an interior space. The indoor room mapping system will be used by MicroVision to demonstrate their new LiDAR's capability to potential customers. In addition, indoor room mapping can be used to generate maps of indoor environments to help people navigate, or for first responders to familiarize themselves with an environment they are en route to, among other uses.

2.2 Overview of proposed project

Our Capstone project involved the design and implementation of a system where a user can capture 360° scans of an indoor environment in 3D using the MicroVision LiDAR sensor. The user can view the result in our custom-built website or their own point cloud viewer.
Our industrial sponsor, MicroVision, has recently begun to market their new indoor LiDAR sensors, and they wanted a system they could use to show potential customers the capabilities of their new LiDAR. Our project included writing software to process, filter, stitch, and view the data. Our prototype processed the data from the sensor and displayed it on the website. A rotating platform was implemented to autonomously rotate the sensor 360° to capture data hands-free. A system to portably power the platform was also designed. Our prototype is operated wirelessly through the 'cloud'.
  • 7. 7 2.3 Deliverables

● Implemented a stitching algorithm to capture a static, time-averaged 360-degree depth view of an interior space, such as a room, using MicroVision’s Consumer LiDAR sensor.
● Processed the obtained data such that it can be viewed and manipulated on a laptop PC.
● Automated the capture of a 360-degree scan of the room through the use of a motion-controlled platform.
● Made the platform, sensor, battery, and any other computational hardware operate untethered.
  • 8. 8 3. Description of Project – Tier 1 Level

3.1 Top-level diagrams and system flowcharts

Tier 1 Hardware System Overview

Figure 3.1.1 Hardware System Overview Diagram

The diagram in Figure 3.1.1 shows our Tier 1 block diagram for the design of our hardware system. The entire system is powered by Lithium-ion batteries (INR18650-35E), which feed into the Battery Management System (VUPN712). The BMS (Battery Management System) is connected to a voltage regulator (D24V50F5), which supplies the required power to each device. The regulator powers the microcomputer and the LiDAR sensor. The stepper driver powers the stepper motor, and the driver is powered directly from the BMS board output. The microcomputer connects to the stepper driver to tell the motor when to start or stop.

Li-ion Battery

The power source consists of three series-connected 18650 Li-ion rechargeable battery cells. Each cell produces approximately 3.6V with a capacity of 3.5Ah, depending on discharge rate.
  • 9. 9 Battery Management System (BMS)

The BMS is a circuit board that charges and discharges the Li-ion batteries safely. It provides over-current and over-voltage protection in case the load in the circuit exceeds what the battery can handle. In addition, the BMS balances each cell and keeps the battery operating within its required specification. The BMS is not being used to charge the batteries; a separate external charger is used for charging them.

Voltage Regulator

The output voltage from the BMS is sent to two 5V fixed-output buck converters (voltage regulators), each capable of producing 5A. One converter powers the Raspberry Pi 4, the other the lidar sensor. The outputs from the converters go to a custom circuit which filters switching-converter noise and provides overvoltage protection and fusing.

Microcontroller

A Raspberry Pi 4 (PI4-2GB-K102-1) microcomputer is used as the central hub to control everything in the system. It is responsible for starting and ending the scanning process. It controls the steps and direction of the stepper motor, provides the main interface for the lidar scanner, and communicates with our cloud server to transmit all captured data and receive commands from the website.

Motor

To control the direction the scanner is facing, a motor is used to rotate the scanning platform to the desired angle. The microcomputer instructs the motor to place the platform at the requested position for the sensor to scan and monitors the motor's status.

MicroVision Lidar PSE-0400li3-101 Sensor

By emitting a short laser burst and measuring the round-trip time, the distance to an object is determined. By firing many laser pulses at a given object, a 3D representation of the object can be constructed.

Figure 3.1.2 MicroVision LiDAR (“Consumer LiDAR Product...”)
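The round-trip ranging described above reduces to distance = (speed of light × travel time) / 2. A minimal sketch of the calculation (the example time below is illustrative, not a real sensor reading):

```python
# Sketch of the time-of-flight range equation: distance = (c * t) / 2.
# The example round-trip time is illustrative, not taken from the sensor.

C = 299_792_458.0  # speed of light in m/s


def tof_distance_m(round_trip_s: float) -> float:
    """Distance to the target given the laser pulse's round-trip time."""
    return C * round_trip_s / 2.0


# A target about 3 m away returns the pulse in roughly 20 nanoseconds.
print(tof_distance_m(20e-9))  # roughly 3.0 m
```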
  • 10. 10 Tier 1 Software System Overview

Figure 3.1.3 Software Overview Flow Chart

The Tier 1 software flowchart of our system is shown in Figure 3.1.3. Our system requires software to be implemented on the Raspberry Pi 4 Model B, a processing server, an application server, a point cloud database, and a website. When the system has been powered up and the user presses the scan button, the system initializes and begins scanning. Data received is stored locally on the Raspberry Pi 4. Once scanning is complete, the data is transferred to the processing server. The processing server uses our custom stitching algorithm to stitch the data into a cohesive model, and then stores the model in a database. At any point after processing, a client computer can view the point cloud (a collection of XYZ points in 3D space) in a web browser.
  • 11. 11 Microcomputer

The microcomputer, a Raspberry Pi 4, runs software that captures data from the LiDAR sensor. The microcomputer temporarily stores the data onboard until the scan is complete and then transmits it to the server. It is powered by the onboard power system.

Processing Server

The processing server is an AWS EC2 instance. EC2 is an Amazon service that provides virtual servers for cloud computing. Processing point clouds is computationally intensive; EC2 allows us to host our code on a Linux machine in the cloud with the computational power to do our processing. It runs our stitching algorithm, which combines all the scan frames into one model. Once finished, it posts the model to the PCD Database.

PCD Database on S3

The database is an AWS S3 service that stores the generated models in the PCD file format. Amazon's S3 service allows for object storage in the cloud and provides programmatic cloud file storage.

Application Server

This server relays information and commands between the Microcomputer and the web viewer through a TCP connection on the server. The Application Server also transfers data to and from the Web App, such as scan data and the static files required to run the website.

Web Viewer / Web App

A 3D web-based display engine, Three.js, has been implemented to display the processed PCD file, which is retrieved from the Application Server. Three.js is a cross-browser library that enables the viewing of 3D computer graphics in a web browser. The processed point cloud is shown in a way that enables the viewer to move, rotate, and zoom within the generated point cloud. The view can also be oriented with a top-down perspective or a first-person perspective, giving multiple perspectives of the room. In addition, a depth shader has been introduced to more easily gauge depth in the point cloud.
The Web Viewer is also used to send commands to the Microcomputer, such as changing the resolution, changing the number of rotations, giving the scan a name, and initiating the scan. The Web App displays the status of the scan.
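For reference, the models in the PCD database follow the Point Cloud Library's ASCII layout. A minimal serializer sketch, assuming plain XYZ float fields (not our production code):

```python
# Sketch of a minimal ASCII .pcd serializer for XYZ point clouds,
# matching the header layout defined by the Point Cloud Library (PCD v0.7).

from typing import List, Tuple


def pcd_ascii(points: List[Tuple[float, float, float]]) -> str:
    """Serialize XYZ points into the ASCII PCD v0.7 format."""
    n = len(points)
    header = "\n".join([
        "# .PCD v0.7 - Point Cloud Data file format",
        "VERSION 0.7",
        "FIELDS x y z",
        "SIZE 4 4 4",
        "TYPE F F F",
        "COUNT 1 1 1",
        f"WIDTH {n}",
        "HEIGHT 1",
        "VIEWPOINT 0 0 0 1 0 0 0",
        f"POINTS {n}",
        "DATA ascii",
    ])
    body = "\n".join(f"{x} {y} {z}" for x, y, z in points)
    return header + "\n" + body + "\n"


print(pcd_ascii([(0.0, 0.0, 0.0), (1.0, 2.0, 3.0)]).splitlines()[0])
```

A file in this form can be opened by any PCD-aware viewer, which is what allows users to inspect scans in their own point cloud software.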
  • 12. 12 3.2 Identification of problems and issues

Several issues came up during the development and implementation of this project, including, but not limited to:

• Depth Data Stitching

Initially, our plan was to use a SLAM (simultaneous localization and mapping) algorithm to stitch our frames together. We were using a SLAM implementation created by Google called Cartographer. However, this approach proved too difficult for the amount of time we were given: we could not get the system configured in the way Cartographer expected. Instead, we implemented our own stitching algorithm that combines frames based on the rotation of the device. A diagram of the inner workings of Cartographer is included in the appendix.

• Motor Control

Initially, we attempted to implement ramping (acceleration/deceleration) of the motor to more smoothly control our platform rotations. After much effort to implement this feature, we decided to control our platform without ramping. Time spent on this feature was blocking progress, so we decided as a team to move forward with a constant speed. We believe the issue was that the Raspberry Pi could not send signals fast enough through the GPIO pins. Lastly, we experienced an issue in capturing depth data due to jitter introduced into the system by the motor. We were able to fix this by introducing a 0.5-second delay after each motor command.

• BMS (Battery Management System)

The BMS did not show consistent performance; sometimes it would turn on and sometimes it wouldn't. This occurred whenever the batteries were replaced. BMS boards are typically designed to be permanently connected to the batteries, and this may have been the root cause of the issue. After rigorous testing and research, we fixed this by connecting a push button between the B+ and P+ terminals of the BMS, as shown in Figure 4.2.2.
When batteries are initially inserted, and before activating the power toggle switch, the push button is pressed to reset and turn on the BMS.

• Battery Case

The first battery case that we tried had contact issues that prevented the Li-ion batteries from making a proper connection with the metal battery terminals on the case. To overcome this issue, we designed our own 3D-printed battery case with better contact points designed for use with Lithium-ion batteries. In addition, the battery case includes wire harnesses, a BMS holder, and a push-button holder to make the routing neater.
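Our rotation-based stitching, mentioned above, can be sketched as rotating each captured frame about the vertical axis by the platform angle at which it was taken and concatenating the results. This is a simplified illustration of the idea, not the production algorithm:

```python
# Simplified sketch of rotation-based stitching: each frame's points are
# rotated about the vertical (z) axis by the platform angle at capture
# time, then all frames are concatenated into one cloud. Illustrative only.

import math
from typing import List, Tuple

Point = Tuple[float, float, float]


def rotate_z(p: Point, angle_deg: float) -> Point:
    """Rotate a point about the z axis by the given angle in degrees."""
    a = math.radians(angle_deg)
    x, y, z = p
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)


def stitch(frames: List[List[Point]], angles_deg: List[float]) -> List[Point]:
    """Concatenate frames after rotating each by its capture angle."""
    cloud: List[Point] = []
    for frame, angle in zip(frames, angles_deg):
        cloud.extend(rotate_z(p, angle) for p in frame)
    return cloud


# Two single-point frames taken 90 degrees apart map to perpendicular rays.
cloud = stitch([[(1.0, 0.0, 0.0)], [(1.0, 0.0, 0.0)]], [0.0, 90.0])
```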
  • 13. 13 4. Description of Project - Tier 2 Hardware

4.1 Identification and specifications of major components

4.1.1 Lidar Scanner components

The major system components needed for a working Lidar system are listed below. The specifications of each component were researched and checked against our desired system build.

Lithium-Ion Battery (Li-ion)

Figure 4.1.1 Samsung 18650 Li-ion Battery

Figure 4.1.1 shows the specification and part image of the battery cell used. The Samsung 18650 is a Lithium-ion cell that is 18.5mm in diameter and 65.25mm long. The Li-ion battery chemistry is able to hold a large amount of electrical charge. This particular model, INR18650-35E, is rated at 3500 mAh and capable of delivering 3.5 A for 1 hour at a nominal voltage of 3.6V. This voltage can go up to 4.2V. For this project, 3 of the cells were connected in series to create a total output voltage of 10.8-12.6V, depending on the battery charge, at 3.5A. This amount of energy is enough to power all system components for approximately one hour.
  • 14. 14 3D Printed Battery Case

Figure 4.1.2 Battery Holder

The battery holder shown above was designed and printed to hold three 18650 batteries connected in series. It was designed to allow easy replacement of the batteries and to provide good electrical contact with the battery terminals. The BMS senses the voltage of each battery cell, and the cells are connected in series. There is a cavity to locate the BMS board, and there is also a wire harness to route power cables cleanly. This allowed us to easily connect the individual cell taps to the BMS board. There is also a holder for a push button that acts as a reset switch for the BMS. Each time the batteries are replaced, the push button must be pressed to reset the BMS.

Battery Management System (BMS)

Figure 4.1.3 BMS

Figure 4.1.3 depicts the BMS used in our system. It is a 3S BMS, which means that it is made for connecting three batteries in series. This matches our spec of three Li-ion batteries. This BMS also has a limit of 8A on the output current. This matches our specification with some additional headroom, since our system will only need around 5A. This BMS also has various
circuit protections, such as over-current protection, short-circuit protection, and over-discharge protection. The BMS quiescent current during operation is approximately 50 μA.

Voltage Regulator

Figure 4.1.4 5V, 5A Step-Down Voltage Regulator

The step-down voltage regulator shown in Figure 4.1.4 lowers the battery voltage to ensure that each device in the system receives its required voltage, since our batteries output around 11.1V and the rest of the system requires a lower voltage. This is a DC-DC switching buck converter that operates at 90% efficiency. The 5V voltage regulator can accept an input voltage up to 38V. Two voltage regulators were used: one for the LiDAR sensor and one for our microcomputer (Raspberry Pi). The regulated 5V output connects to each device. Note that the 5V regulator can output a continuous current of 5A maximum. This is not a problem, however, because the current load on the regulators does not surpass 5A. A low-pass filter was used on the output of each voltage regulator to reduce noise that may be induced by the regulators.
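The output low-pass filters attenuate the regulator's switching ripple above their cutoff frequency, f_c = 1 / (2π√(LC)). A small sketch of the sizing arithmetic; the component values are illustrative assumptions, not the values on our board:

```python
# Sketch of the LC low-pass cutoff frequency used when sizing an output
# filter: f_c = 1 / (2 * pi * sqrt(L * C)).
# The component values below are assumed examples, not our board's values.

import math


def lc_cutoff_hz(l_henry: float, c_farad: float) -> float:
    """Corner frequency of an ideal LC low-pass filter."""
    return 1.0 / (2.0 * math.pi * math.sqrt(l_henry * c_farad))


# e.g. 10 uH and 100 uF put the cutoff near 5 kHz, well below a typical
# switching frequency in the hundreds of kHz, so ripple is attenuated.
print(round(lc_cutoff_hz(10e-6, 100e-6)))
```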
  • 16. 16 Power PCB

Figure 4.1.5 Power PCB

The PCB shown in Figure 4.1.5 was designed to house two voltage regulators, two Pi filters, two fuses, and two Zener diodes. The PCB is 1.9 inches by 1 inch. Its width can be reduced to 0.8 inches, if desired, because there is an extra 0.2 inches at the top of the board that can be sheared off. There are keep-out and restricted areas around each screw hole to prevent the copper ground plane from touching the screws. A single-layer ground plane is on the bottom of the board. The trace width is 24 mils, which was calculated using a current of 3.3 A through each trace and a copper thickness of 2 oz/ft². There are three test points: GND, voltage regulator 1, and voltage regulator 2. The voltage regulators are mounted on top of this board using the two top screw holes and the female four-pin headers at the bottom of the board. The input connects to the board at the right-angle JST four-pin header at the top right of the board. The two right-angle JST two-pin headers at the top of the board are the outputs of the board. They connect to the LiDAR sensor and the Raspberry Pi.
  • 17. 17 Raspberry Pi 4 Model B

Figure 4.1.6 Raspberry Pi 4 Model B

The Raspberry Pi 4 Model B (PI4-2GB-K102-1) acts as the central controller for the LiDAR scanner system. It is responsible for coordinating the movement of the stepper motor (direction and speed), gathering LiDAR scan data from the MicroVision LiDAR sensor, and sending the lidar data to a server for further processing. To achieve all of the above tasks, the Raspberry Pi 4 comes with four USB ports and onboard systems. As shown in Figure 4.1.6, the board is powered by an Arm Cortex-A72 processor. This processor has a 64-bit architecture and is capable of running at up to 1.5 GHz. It is backed by 2GB of RAM, adequate to run all of the processes needed for the working lidar scanner. Furthermore, the board comes with 2x USB 2.0 ports and 2x USB 3.0 ports, providing four ports to connect our lidar sensor and any additional USB hardware required for the operation of the system. The LiDAR sensor should be used with the USB 3.0 ports for best performance. To store the collected data sent by the sensor, the Raspberry Pi 4 Model B has a built-in SD card slot for storing large amounts of data. Upon completing the scan, the stored data is sent to a remote server for processing. This is done with the onboard 2.4 GHz / 5 GHz IEEE 802.11b/g/n/ac Wi-Fi chip, which transfers files at a maximum speed of 1300 Mb/s. To control and monitor external circuits, such as the encoder and stepper motor, the board comes with 40 GPIO pins. These pins can be configured to read/write pulse-width-modulated, digital, and interrupt signals.
  • 18. 18 Stepper Motor

Figure 4.1.7 Nema 14 Stepper Motor

To get a precise scan, a stepper motor (14HS10-0404S) is used to rotate the lidar sensor. Stepper motors are generally used in applications where precise position control is desirable and the cost or complexity of a feedback control system is unwarranted. The Nema 14 is a hybrid bipolar stepping motor that has a 1.8° step angle (200 steps/revolution). Each phase draws 500 mA at 12V, allowing for a holding torque of 1 kg-cm (14 oz-in). The motor has four color-coded wires terminated with bare leads: black and green connect to one coil; red and blue connect to the other. It can be controlled by a pair of suitable H-bridges (one for each coil).

Stepper Driver

Figure 4.1.8 DRV8825 Stepper Driver

To control the speed and direction of the stepper motor, a stepper motor driver is required. The DRV8825 is a stepper driver capable of operating the stepper motor in our system, within the motor's required voltage and current ranges. The chip is capable of delivering up to 2.5A of current, enough to meet the maximum required current of the stepper motor. In addition, the DRV8825 can operate at full-step, 1/2, 1/4, 1/8, 1/16, and 1/32 microstep resolutions. Microstepping control divides each full step into smaller steps to help smooth out the motor's rotation, especially at slow speeds. For example, a 1.8-degree step can be divided up to 32 times, providing a step angle of 0.05625 degrees (1.8 ÷ 32), or 6400 microsteps per revolution.
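The microstepping arithmetic above can be checked directly. A small sketch (driver pin toggling omitted; the divisors match the DRV8825's resolutions):

```python
# Sketch of the microstepping arithmetic for a 1.8-degree stepper driven
# by a DRV8825-style driver. GPIO pulse generation is deliberately omitted.

FULL_STEP_DEG = 1.8  # Nema 14 full-step angle (200 steps/revolution)


def microsteps_per_rev(microstep_divisor: int) -> int:
    """Microsteps needed for one full shaft revolution."""
    return int(round(360.0 / FULL_STEP_DEG)) * microstep_divisor


def microstep_angle_deg(microstep_divisor: int) -> float:
    """Angle moved per microstep at a given divisor."""
    return FULL_STEP_DEG / microstep_divisor


print(microsteps_per_rev(32))    # 6400 microsteps per revolution
print(microstep_angle_deg(32))   # 0.05625 degrees per microstep
```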
• 19. 19 Figure 4.1.8.1 Step Resolutions Our design allows the step resolution to be changed on our interconnect PCB (the PCB shown in Figure 4.1.11) to get the best performance. The step resolution is changed manually by moving a jumper found on the interconnect PCB. Figure 4.1.8.1 shows several possible step resolutions. Rotary Encoder Figure 4.1.9 Rotary Encoder To ensure the accuracy of the scan, a rotary encoder (S4T-200-236-S-B), shown in Figure 4.1.9, is used. In the event of step loss from the stepper motor, the rotary encoder is used to compare the actual motor position against the position expected by the software. If the system detects a disagreement between the two readings, it assumes an improper scan has been executed and restarts the system for a new scan.
• 20. 20 The encoder used is the SRT Miniature Optical Shaft Encoder. Optical encoders use light instead of contacts to detect position, so they are inherently free from contact wear. For our system, a 200 CPR encoder is used. Cycles per revolution (CPR) is a measure of the resolution of the encoder. One complete rotation of the encoder shaft is 360 mechanical degrees. A 200 CPR encoder can provide 200, 400, or 800 positions per revolution depending on whether x1, x2, or x4 quadrature decoding is performed. MicroVision LiDAR PSE-0400li3-101 Sensor Figure 4.1.10 MicroVision LiDAR The figure above shows MicroVision's MEMS-based 3D LiDAR engine, the sensor utilized in our system. This sensor's data was used to construct 360-degree 3D layouts of rooms, but there are many more future possibilities in AI and machine learning; constructing 360-degree 3D scans is just the first step. (Consumer LiDAR Product...)
• 21. 21 Rotating Platform The rotating platform holds all of the hardware required to operate our indoor LiDAR mapping system. MicroVision designed a mechanical platform for this project based on our specific requirements. Figure 4.1.11 shows the front view and Figure 4.1.12 shows the side view of the rotating platform. The diagram below is intended to show the mechanical connections between components. Figure 4.1.11 Front View of the Rotating Platform The platform uses a stepper motor to rotate the top platform via a timing-belt pulley mounted on an 8 mm shaft. The top rotating platform holds the MicroVision LiDAR sensor, Raspberry Pi 4, and voltage step-down board. The bottom stationary platform holds the battery module, stepper motor, stepper motor driver, encoder, and timing belt. The two platforms are connected electrically through a slip ring. The stepper motor driver is mounted onto the Interconnect PCB, which was designed by MicroVision. This PCB also has a connector wired to the slip rings; any connection made between the top and bottom platforms passes through this connector and through the slip rings. Because the platform is self-contained and designed to be untethered, it can rotate continuously: there are no mechanical stops, and the system is closed-loop. The platform uses spacers to implement a multi-floor design, making it versatile if more floors are needed. Threaded rods running through the spacers, secured with nuts, hold the platform floors together. Figure 4.1.12 shows the GT2 timing belt. The GT2 timing belt synchronizes the motion-control elements: the stepper motor, rotary encoder, and rotating top platform. A 1:1.8 gear ratio sets the speed and direction of the rotating platform relative to the motor. The stepper motor requires 200 steps to make one full rotation, since each step is 1.8° (200 x 1.8° = 360°). The stepper
• 22. 22 motor is connected to the rotating base through a 1:1.8 geared pulley. The turntable therefore requires 200 × 1.8 = 360 full steps to rotate one full turn, and 60 steps for a 60-degree motion. If the 1/32 microstep resolution is selected on the stepper driver, one full turn of the top rotating platform requires 360 × 32 = 11520 microsteps. Figure 4.1.12 Side View of the Rotating Platform
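The gear-ratio arithmetic described above can be sketched as a small helper, assuming the 1:1.8 pulley ratio and 200-step motor from the text. This is illustrative, not the system's actual control code:

```python
# Driver pulses needed to rotate the top platform by a given angle,
# using the 1:1.8 pulley ratio and 200-step motor described in the text.

MOTOR_STEPS_PER_REV = 200  # 1.8 degrees per full step
GEAR_RATIO = 1.8           # motor revolutions per platform revolution

def platform_steps(angle_deg: float, microstep: int = 1) -> int:
    """Pulses to send to the driver to turn the platform by angle_deg."""
    full_steps_per_platform_rev = MOTOR_STEPS_PER_REV * GEAR_RATIO  # 360
    return round(full_steps_per_platform_rev * microstep * angle_deg / 360)

print(platform_steps(360))      # 360 full steps for one platform turn
print(platform_steps(60))       # 60 full steps for a 60-degree motion
print(platform_steps(360, 32))  # 11520 microsteps at 1/32 resolution
```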
• 23. 23 Figure 4.1.13 Constructed Platform from Design (First Iteration) The image in Figure 4.1.13 shows a prototype of the platform without its top portion. This prototype was built by MicroVision; it was still under development at the time, and the final version consisted of metal parts.
  • 24. 24 Figure 4.1.14 Final Constructed Platform from Design The image in Figure 4.1.14 shows the final build of the complete system. The platform was designed and built by MicroVision.
  • 25. 25 4.2 Layout diagrams for specific components and subsystems Figure 4.2.1
• 26. 26 Figure 4.2.1 outlines the complete wiring diagram for the LiDAR scanner system. Each hardware component is interconnected with the others through the pins found on the device. The diagram comprises key components and devices such as the battery management system (BMS), voltage regulator, stepper driver, rotary encoder, Raspberry Pi, and LiDAR scanner, along with passive components such as inductors and capacitors. Since each component and device plays an important role in the overall function of the system, the key components and devices are described in the following section. Battery Management System (BMS) Figure 4.2.2 Battery Management System (BMS) Figure 4.2.2 shows the input and output diagram for the BMS. The battery cells are connected in series to provide power to the load. Each cell's voltage is also an analog input to a difference amplifier on the BMS board, so the BMS monitors each battery's performance under discharge regardless of its position in the series chain. The positive and negative terminals of the batteries are connected to the B inputs: B+ is the positive input of the first battery, B1 is the negative input of the first battery and the positive input of the second battery, and so on. P+ and P- are the terminals through which the BMS outputs power. In addition, there is a push button linking B+ and P+, and a switch connected to P+.
  • 27. 27 Power PCB Figure 4.2.3 Power PCB Circuit Diagram Figure 4.2.4 Power PCB Eagle Schematic
• 28. 28 Figure 4.2.3 shows the diagram of the components within the Power PCB, and Figure 4.2.4 is the Eagle schematic of the PCB. The Power PCB contains two voltage regulators, two Pi filters, two fuses, and two Zener diodes. Each voltage regulator has five pins: Enable (En), Vin, VinGND, Vout, and VoutGND. The voltage regulators are connected to the output of the battery management system (BMS) through the Vin (2) and VinGND (3) pins, and the stepped-down output voltage is taken from the Vout (6) and VoutGND (4) pins. The En pin gives the option to switch between an on state and a low-power state; it can be disconnected, as it is in our current system, if this feature is not desired. Each voltage regulator is connected to a low-pass filter, a fast-blow resettable fuse, and a 5.6 V Zener diode. The low-pass filter is a Pi-type filter used to reduce noise, in the form of voltage fluctuations, from the voltage regulators. The peak-to-peak output voltage of the regulators was originally 58.2 mV (with a switching frequency of 151.5 kHz). Our Pi filter was designed with a cutoff frequency of 3.63 kHz; after filtering, the output ripple dropped to 2.77 mV (Vp-p). The fast-blow fuse provides overcurrent protection, and the Zener diode provides overvoltage protection. Stepper Driver Figure 4.2.5 DRV8825 Stepper Driver Figure 4.2.5 shows the circuit diagram of the DRV8825 integrated motor driver. The DRV8825 has 16 pins. Pins 1 to 8 control properties of the stepper driver; this group includes the enable, mode-select, sleep, reset, step, and direction pins. The step, direction, and enable pins
• 29. 29 are connected to the GPIO pins on the Raspberry Pi 4. To control the step resolution, the mode-select pins (pins 2, 3, and 4) are also connected to GPIO pins. The outputs of the DRV8825 are located on pins 9-16; these include the motor power-supply connections and four outputs for the stepper motor's stator cables. Pins A1, A2, B1, and B2 connect to the stepper motor's four stator cables. The DRV8825 can be powered with a supply voltage between 8.2 and 45 V and is capable of providing an output current of up to 2.5 A full-scale. To reduce voltage ripple, a 100 uF capacitor must be placed between pins 15 and 16. To reduce power consumption when idle, the driver can be put into a low-power sleep mode via its sleep pin. Lidar Scanner and Rotary Encoder Figure 4.2.6 MicroVision Lidar Scanner Figure 4.2.6 shows the connections to the MicroVision PSE-0400li3-101. The LiDAR provides three input/output connections that are used for building our desired system. The Vin and GND pins are the power input for the LiDAR unit; the sensor requires an input voltage of 5 V at 2 A, supplied from the voltage regulator. To receive the scan output from the LiDAR unit, the USB port of the LiDAR is connected to a USB port of the Raspberry Pi 4 Model B. Data is transferred over USB 3.0, which provides sufficient bandwidth between the two devices. To ensure scans are performed correctly, the rotary encoder is also connected to the GPIO pins of the Raspberry Pi 4 Model B. The rotary encoder is linked mechanically to the stepper motor's drive train and provides constant feedback to the Raspberry Pi to verify that the required steps are being executed properly. Pins A+, A-, B+, and B- are connected to the GPIO pins, and the encoder's Vin and GND pins are connected to the Raspberry Pi's onboard 5 V and GND pins.
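For illustration, the x4 quadrature decoding that this A/B wiring enables can be sketched in pure Python. This is a hypothetical state-table decoder, not the project's firmware; on the real system the A/B edges arrive as GPIO interrupts via RPi.GPIO, and a 200 CPR encoder decoded x4 yields 800 positions per revolution:

```python
# Hypothetical x4 quadrature decoder for the encoder's A/B channels.
# Not the project's actual firmware; the state table is standard quadrature.

CPR = 200
POSITIONS = CPR * 4  # x4 decoding: count every edge on both A and B

# Valid transitions between 2-bit AB states map to +1 or -1 counts.
TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def decode(states):
    """Accumulate a signed count from a sequence of sampled AB states."""
    count = 0
    for prev, cur in zip(states, states[1:]):
        count += TRANSITIONS.get((prev, cur), 0)  # invalid jumps count 0
    return count

def counts_to_degrees(count):
    return count * 360.0 / POSITIONS

# One forward quadrature cycle = 4 counts = 1.8 degrees at 800 counts/rev.
forward = [0b00, 0b01, 0b11, 0b10, 0b00]
print(counts_to_degrees(decode(forward)))  # 1.8
```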
• 30. 30 4.3 Explanation of how software interfaces with hardware Our LiDAR scanner system requires an interface between hardware and software to work properly. Core components such as the stepper driver chip (DRV8825) and the MicroVision LiDAR sensor all require onboard software to monitor and command them. To achieve this, the GNU/Linux operating system is used. Figure 4.3.1 Layer diagram of software and hardware interfaces in GNU/Linux operating system Figure 4.3.1 outlines the system architecture of the GNU/Linux operating system. The operating system divides the execution of programs into two major categories: kernel space and user space. Kernel space is reserved for the most important processes. When a user runs an application or tool, that application or tool executes in what is called user space. Since applications can come from a variety of sources, some may be poorly developed or originate from unknown sources; running these applications separately from kernel space prevents programs from tampering with kernel resources and causing the system to panic or crash. The hardware consists of all peripheral devices (RAM, storage, CPU, I/O) and is accessed by the kernel. To get the most out of the system, core parts of the LiDAR system are controlled by programs running in kernel space, ensuring that the required process is executed on demand by the CPU. Other processes, such as post-processing, that do not require direct access to the hardware are executed in user space. The different privilege levels guarantee that important processes take precedence over less important ones.
• 31. 31 4.4 How hardware was fabricated and assembled Each component was purchased and then integrated into our Power PCB. All chips were surface-mounted onto the PCB, which is designed to provide intermediary connections between the surface-mounted chips. Two low-pass Pi filters were built and soldered directly onto the PCB at the outputs of the voltage regulators. The battery holder shown in Figure 4.1.2 was placed on the platform and used to hold the batteries in place. Input and output connectors were soldered onto the PCB to connect it to the rest of the system.
• 32. 32 5. Description of Project - Tier 2 Software 5.1 Identification of software that will be used ● Frontend Website Technologies ○ HTML/CSS/JS ■ Vue.js - A JavaScript framework that helps modularize web components. ■ Typescript - A strictly typed programming language that compiles to JavaScript. ■ Webpack - A JavaScript module bundler. ■ SCSS/SASS - A stylesheet language that compiles to CSS. ○ Node.js ■ Express - A web application framework. ■ Socket.io - A library that abstracts away the socket interfaces and quickly enables TCP socket events. ● Amazon Web Services (AWS) ○ Simple Storage Service (S3) - An object storage service. ○ Amazon Elastic Compute Cloud (EC2) - A service that provides cloud servers. ● Point Cloud Processing and sensor interfaces ○ Python ■ ROS ■ RPi.GPIO ■ RpiMotorLib ■ OpenCV
• 33. 33 5.2 System requirements to run software Raspberry Pi 4 B A Linux distribution such as Ubuntu or Raspbian was installed on the Raspberry Pi to run the onboard code, along with a Python 2 interpreter to run the program. Python 2 is a cross-platform language that can be interpreted on different boards, so if another microcomputer were required to replace the Pi, porting the code would be straightforward: only the GPIO library would need to be modified, because the board uses I/O pins to communicate with the motor driver and rotary encoder. To optimize efficiency and speed, unneeded processes running on the Pi are disabled, and our program is set to start automatically on boot. IDE Various IDEs (integrated development environments) were utilized to facilitate writing the Python code: Visual Studio Code, Atom, and PyCharm. IDEs are preferred over the terminal because they allow vertical integration of writing code, testing, version control, formatting, and more.
• 34. 34 5.3 Software Systems Overview Our system consists of five major components: the Processing Server, the Application Server, the PCD Database (S3), the Web Viewer (Web App), and the Microcomputer (see Fig. 5.3.1). The Web Viewer initiates a scan by emitting an event. The Microcomputer receives the event, begins scanning the room, and sends the resulting scans to the Processing Server. The Processing Server stitches together the LiDAR scans and stores the final model in S3 storage. Finally, a user uses the Web Viewer to view the models. Figure 5.3.1 Software System Architecture
• 35. 35 5.4 Details of Software Systems Application Server Overview The Application Server serves multiple purposes: it facilitates communication between the WebApp and the Microcomputer, serves the WebApp to a client browser, provides access to the saved PCD files, and provides metadata about what each PCD file represents. Event Handler The Event Handler facilitates dynamic communication between the WebApp and the Microcomputer and retains information about what transpired. This enables settings for the scanner to be adjusted from the WebApp. Retaining the metadata of each scan is also important because the PCD database only retains the name of a scan; otherwise, information regarding the scan's resolution, rotation count, and elapsed time would be lost. That data is valuable for evaluating the performance of the system and should therefore be stored. Documentation for each event is given in the next section. API Endpoints API (Application Programming Interface) endpoints are exposed to the WebApp for passing static data, such as PCD files, HTML and other static files, and scan metadata. Figure 5.4.1 Application Server Architecture
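The Application Server itself runs on Node.js; purely to illustrate the metadata-retention idea described above, here is a minimal language-agnostic sketch (shown in Python, with hypothetical field names rather than the actual server schema):

```python
# Hypothetical sketch of metadata retention: the PCD database keeps only a
# scan's name, so the event handler records the rest alongside that name.
# Field names here are illustrative, not the actual server's schema.

scan_metadata = {}

def record_scan(name, resolution, rotations, elapsed_s):
    """Store the scan settings that the PCD database would otherwise lose."""
    scan_metadata[name] = {
        "resolution": resolution,  # microstep setting used for the scan
        "rotations": rotations,    # number of platform rotations
        "elapsed_s": elapsed_s,    # scan duration in seconds
    }

def lookup(name):
    return scan_metadata.get(name)

record_scan("lab-room-1", "1/32", 1, 312.5)
print(lookup("lab-room-1")["resolution"])  # 1/32
```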
• 36. 36 Processing Server Overview The Processing Server is responsible for taking the BAG file generated by the Microcomputer and outputting a stitched model of the room in PCD format. The Microcomputer passes the BAG file to an API endpoint, which hands it to the initializer process that begins the stitching. When stitching is complete, the server passes the PCD to the PCD Database. Processing Script This script starts up a stitching process node and then transfers the data to itself from the S3 database. Once these processes have completed, the final PCD file is added back into the S3 database. API Endpoints API (Application Programming Interface) endpoints are exposed to the Microcomputer for passing the BAG file that contains the scan information. Figure 5.4.2 Processing Server Architecture
• 37. 37 PCD Database Overview The PCD Database uses Amazon Simple Storage Service (S3) for file storage. It stores the final stitched PCD files and the BAG files (a ROS file format that stores message data) that are processed by the server. The Processing Server uploads the BAG and PCD files, and the Application Server downloads the PCD files. The structure of the data is as follows: Figure 5.4.3 PCD Database
• 38. 38 Rotary Encoder Figure 5.4.4 Rotary Encoder The rotary encoder flowchart in Figure 5.4.4 shows how information is gathered from the rotary encoder and made available for use. The encoder interfaces with the Raspberry Pi through a GPIO library called RPi.GPIO. The rotary encoder code acts as a publisher, publishing the current angular position of the rotary encoder. After initialization, the rotary encoder software waits until it receives a pulse on an I/O pin. After receiving a pulse, a subroutine increments the pulse counter and converts the total number of pulses to degrees. A global angular position variable is updated and published. LiDAR Sensor Figure 5.4.5 LiDAR Sensor The LiDAR sensor flowchart in Figure 5.4.5 demonstrates how LiDAR depth information is gathered and output. The LiDAR sensor code acts as a publisher, publishing depth frames from the LiDAR. After initialization, OpenCV (Open Computer Vision) is used to open a connection to the sensor. The software then waits until a capture request is received, at which point the LiDAR depth data is saved. The depth frame is then processed and converted to Cartesian coordinates. Lastly, the depth frame is published.
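The conversion of depth samples to Cartesian coordinates mentioned above can be illustrated with a small sketch. The angle conventions and function name here are assumptions for illustration, not the actual OpenCV-based pipeline; the idea is that a range reading, combined with the platform's azimuth from the encoder and a per-pixel elevation, maps to an (x, y, z) point:

```python
import math

# Illustrative spherical-to-Cartesian conversion for a single depth sample.
# Angle conventions are assumptions; the real pipeline uses OpenCV and ROS.

def depth_to_cartesian(depth_m, azimuth_deg, elevation_deg):
    """Map (range, azimuth, elevation) to (x, y, z) in metres."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = depth_m * math.cos(el) * math.cos(az)
    y = depth_m * math.cos(el) * math.sin(az)
    z = depth_m * math.sin(el)
    return (x, y, z)

# A point 2 m away, straight ahead at zero elevation, lies on the x-axis.
print(depth_to_cartesian(2.0, 0.0, 0.0))  # (2.0, 0.0, 0.0)
```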
• 39. 39 Motor Driver Figure 5.4.6 Motor Driver The motor driver flowchart in Figure 5.4.6 demonstrates how motor driver commands result in platform rotation. The motor driver code interfaces between the Raspberry Pi and the motor driver board through a library called RpiMotorLib. After initialization, the software establishes a connection to the driver board through the I/O pins on the Raspberry Pi. The software then waits until a request to move the motor is received, at which point the Raspberry Pi communicates to the driver board the number of steps, acceleration, speed, and direction in which to move. ROS Bag Figure 5.4.7 ROS Bag The ROS Bag flowchart in Figure 5.4.7 shows the process of bag file creation. ROS (Robot Operating System) is the framework our project utilized to compile all of our sensor data together. The bag file acts as a subscriber to the sensor nodes, which publish data to it. The chart above consists of two main publisher nodes, the rotary encoder and the LiDAR; these two nodes publish their data, which is compiled into a ROS bag and sent to the S3 database.
• 40. 40 Processing Server Figure 5.4.8 Processing Server The processing server flowchart shows the end-to-end process from a raw BAG file to a complete PCD file. After receiving a BAG file, the stitching algorithm is initialized with the metadata of the scan. The algorithm is then executed to convert the BAG file into a point-cloud data set in the form of a PCD file. The completed point-cloud PCD file is then exported to the S3 database.
• 41. 41 Web App Overview The Web App is the main interface between the user and the hardware. A user is able to send commands to the hardware, see what state the scanning process is in, and view all previously generated scans in the web viewer. The important screens are detailed here. Home This page gives an overview of the project. A user can learn more about the project, begin the scanning process, or view one of the most recent scans. LiDAR Scanning This page allows the user to send commands to the hardware: they can adjust the resolution, name the file, and set the number of rotations. The state and progress of the scan are also displayed. Point cloud viewer This page displays a generated point cloud to the user, who can rotate, scale, and move the scan using the keyboard and mouse. Metadata is also displayed. Figure 5.4.9 Web App
• 42. 42 6. Integration and Testing 6.1 Procedure for testing and debugging Hardware Testing and debugging Testing: BMS / Li-Ion charger Tests were run on the BMS and the Li-ion charger. First, the batteries were connected to the BMS and the output of the BMS was connected to a dummy load (resistors). An oscilloscope and a multimeter were used to measure the voltage across the load to confirm that the BMS was working as intended. Because the batteries were discharged during that test, the charger was tested next. This was done simply by plugging the batteries into the charger and checking that each terminal of the charger works and successfully charges the batteries. Testing: LiDAR sensor, voltage regulation and current The LiDAR sensor was verified to be operational: once battery power was attached to the system, the sensor was checked for functionality, and test scans were taken prior to attaching it to the system. Next, the voltage regulator was tested by connecting it to a resistor dummy load and measuring the voltage across it and the current through it. Figure 6.1.1 shows the results of the voltage regulator test. The measurements were taken with an oscilloscope and/or multimeter. Figure 6.1.1 Voltage Regulator Performance Test Testing: Switching frequency of the voltage regulator The 5 V voltage regulator used is of the switching type. As a result, there is additional noise induced by the regulator that needed to be minimized. This noise can cause problems in the operation of the LiDAR sensor and the Raspberry Pi, because the input voltage may fluctuate beyond what the two devices tolerate. Therefore, a low-pass filter was added to the output of the voltage regulator. Prior to that, the two devices were tested without a filter to see if they still worked. When designing the filter, the switching frequency of the regulator needed to be determined.
This was accomplished by connecting a load resistor to the regulator’s output and analyzing the resulting output on an oscilloscope. Then the filter was designed to have a suitable cutoff frequency. Figure 6.1.2 shows the results of the low-pass filtering.
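As a quick sanity check on those results, the ripple attenuation the Pi filter achieved can be expressed in decibels using the peak-to-peak figures reported in Section 4.2 (58.2 mV before filtering, 2.77 mV after):

```python
import math

# Ripple attenuation achieved by the Pi filter, from the measured
# peak-to-peak figures reported in Section 4.2 (58.2 mV -> 2.77 mV
# at a 151.5 kHz switching frequency).

v_before_mV = 58.2
v_after_mV = 2.77

attenuation_db = 20 * math.log10(v_before_mV / v_after_mV)
print(round(attenuation_db, 1))  # about 26.4 dB of ripple attenuation
```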
• 43. 43 Figure 6.1.2 Voltage Regulator Filtering Results Testing: Stepper motor test & stepper driver test The stepper motor has four stator wires, each belonging to one of two groups: group A and group B. A0 and A1 are linked internally, and B0 and B1 are linked internally as well. To check the working condition of the stepper motor, a continuity measurement of each group was performed; this test verified the health of the motor's coils. To check the working condition of the stepper driver, a multimeter was used to measure the output current and voltage of the device. If the measured output did not correspond to the desired output, adjustments were made with the driver's onboard potentiometer.
• 44. 44 Software Testing and debugging Tools ● Postman - an API testing tool that enables sending HTTP requests with specified data. These requests can be chained through scripting (written in JavaScript) in order to test the functionality of the server. ● MeshLab - a 3D processing tool that can be used to visualize PCD files. Overview The software testing consisted of multiple parts: automated tests ensured that the services were working as expected, and manual tests were performed to validate the output. Automated tests Postman tests were written to ensure that each route performed as expected. During development, all tests were automatically run every time the software was updated, ensuring that updates did not break the current build. Testing each server's routes, detailed in the documentation in previous sections, involved sending a request and validating the response. Each route was tested on its own and then in relation to a use case. The Application Server events were tested in a similar manner. The automated testing plan is detailed here: Routes ● API Testing ○ Application Server ■ “/” Data in Request sent: N/A Response Expected: HTML in any form ■ “/scans” Data in Request sent: N/A Response Expected: A JSON array of data that details what scans are available. ■ “/scan/:scanid” Data in Request sent: scanid for a given scan. Response Expected: A PCD file
• 45. 45 ■ “/snapshots” Data in Request sent: N/A Response Expected: A JSON array of data that details what snapshots are available. ■ “/snapshot/:snapshotsid” Data in Request sent: snapshotsid for a given snapshot. Response Expected: A PCD file ○ Processing Server ■ “/scan” Data in Request sent: BAG file Response Expected: Status 200 Extra validation: Check if the PCD Database contains the expected BAG file ● Integration Testing Manual Tests ● Motor driver step commands Tests were conducted to make sure that the number of steps sent by the Raspberry Pi to the stepper motor resulted in the correct amount of rotation. This was tested by polling the rotary encoder for the current angular position of the motor shaft and correlating that value with the command that was sent to the motor driver. Another approach was to use a protractor and mark the expected degrees of movement; this marker on the moving component was compared with the expected resulting position to see if the correct movement was made. ● Timing of events Tests were completed to ensure that all movements and data captures occurred exactly when expected. It was expected that the LiDAR would capture a frame every X degrees of rotation. To test this, the angular position of the shaft was saved every time the LiDAR
• 46. 46 captured a frame of data. This data was then reviewed to ensure that a frame was taken every X degrees. ● Point cloud result We ensured that the point-cloud output of the stitching algorithm represented the actual room that was scanned. This was verified by comparing the actual room to the 3D point-cloud output viewed in the viewer. Different objects and features of the room in the point cloud were observed and compared to the actual room to check that they were at roughly the same relative positions and distances. 6.2 Final results Our final complete system consists of a LiDAR sensor mounted on a portable rotating platform that creates a 360-degree point-cloud map of a room. A user is able to command the system to begin the scanning process, send the data captured by the sensor to a server for processing, and then view the resulting 3D map in a web viewer. The entire system is portably powered by rechargeable Li-ion batteries. Figure 6.2.1 RGB photo of Discovery Hall Figure 6.2.2 3D scan of Discovery Hall
  • 47. 47 Figure 6.2.3 3D scan including plan view, and interior view
• 48. 48 7. Bill of Materials (BOM)

| Purpose | Description | Manufacturer | Distributor | Product ID | Unit price | Qty | Cost |
|---|---|---|---|---|---|---|---|
| Product | Raspberry Pi 3 B+ Kit | Raspberry Pi | Amazon | B07BCC8PK7 | $79.99 | 2 | $159.98 |
| Product | Raspberry Pi 4 B Basic Kit | Raspberry Pi | CanaKit | PI4-2GB-K102-1 | $59.95 | 2 | $119.90 |
| Product | Miniature Slip Ring | Prosper | Adafruit | 1195 | $24.95 | 1 | $24.95 |
| Product | Nema 14 Bipolar Motor | StepperOnline | StepperOnline | 14HS10-0404S | $9.52 | 1 | $9.52 |
| Product | Stepper Motor Driver | Texas Instruments | Pololu | DRV8825 | $8.95 | 1 | $8.95 |
| Product | 72T, GT2-2mm pulley, 8mm bore | SDP/SI | SDP/SI | A6Z51M072DF0608 | $9.04 | 1 | $9.04 |
| Product | 40T, GT2-2mm pulley, 5mm bore | SDP/SI | SDP/SI | A6Z51M040DF0605 | $7.45 | 1 | $7.45 |
| Product | 40T, GT2-2mm pulley, 6mm bore | SDP/SI | SDP/SI | A6A51M040DF0606 | $15.00 | 1 | $15.00 |
| Product | 145 tooth belt, GT2, 2mm pitch, 6mm wide | SDP/SI | SDP/SI | A6R51M145060 | $5.61 | 1 | $5.61 |
| Product | Encoder, 200 counts/turn, 6mm shaft | US Digital | US Digital | S4T-200-236-S-B | $71.80 | 1 | $71.80 |
| Product | Encoder harness; #26 gauge, 4x1.25mm pitch | US Digital | US Digital | CA-MIC4-W4-NC-1 | $6.80 | 1 | $6.80 |
| Product | Samsung 18650 3500 mAh Li-Ion Battery | Samsung | imrbatteries | INR18650-35E | $4.99 | 3 | $14.97 |
| Product | 5V Step-Down Voltage Regulator | Pololu | Pololu | D24V50F5 (item #2851) | $14.95 | 2 | $14.95 |
| Product | Nitecore i4 (4 Bay Battery Charger) | Nitecore | 18650Battery | B00GODG3X0 | $16.99 | 1 | $16.99 |
| Product | Lithium Ion BMS board | Vetco | Vetco | VUPN712 | $10.95 | 1 | $10.95 |
| Product | Metal platform parts | Siverson Design | Siverson Design | N/A | $2621.65 | 1 | $2621.65 |

Total parts: 21. Total cost: $3,133.46
• 49. 49 9. List of Industry Standards IEEE 802 standards IEEE 802 is a family of IEEE standards restricted to networks carrying variable-sized packets (cellular network data, by contrast, is transmitted in uniform units). The working groups of IEEE 802 include 802.2 (Logical Link Control), 802.3 (Ethernet), 802.11 (Wi-Fi), and 802.15 (Wireless Personal Area Network), the last of which includes 802.15.1 (Bluetooth) and 802.15.4 (Low-Rate Wireless PAN). The services and protocols map to the data link and physical layers. The radio frequencies covered range from 2.4 GHz and 5 GHz up to 60 GHz. IEEE 1394 IEEE 1394, High Performance Serial Bus, is an electronics standard for connecting devices to a personal computer, commercially known as FireWire. IEEE 1394 provides a single plug-and-socket connection on which up to 63 devices can be attached, with data transfer speeds up to 400 Mbps (megabits per second). Comparable serial interfaces include USB and SCSI. Inter-Integrated Circuit (I2C) Protocol A protocol intended to allow multiple “slave” digital integrated circuits (“chips”) to communicate with one or more “master” chips. It is intended only for short-distance communication within a single device and requires only two signal wires to exchange information. It is used in this project for interfacing the MCU with the accelerometer. Laser Safety Standard The MicroVision LiDAR our team is using for this Capstone project is classified as a Class 1 laser product as defined in IEC 60825-1. A Class 1 laser is eye-safe under all operating conditions; the beam produces less than 0.39 milliwatts. Python 2 A programming language is a set of rules and keywords that enable a person to communicate with and control a computer. By following Python 2's syntax rules, keywords, and structures, our team was able to create the software, which the Python interpreter was able to read and execute to run our system.
  • 50. 50 10. Acknowledgements We would like to thank Roger Johnson, our industry advisor from MicroVision. Roger has been incredibly helpful to our team for designing the mechanical platform that housed our electronics and software. Roger has also guided and helped our team throughout phase 1 and 2 of our Capstone project. We would also like to thank Dr. Wayne Kimura, our faculty advisor. Dr. Kimura has helped our team throughout phase 1 and 2 of Capstone by ensuring that our group was always on top of everything, and by providing guidance and advice. We would like to thank Selvan Viswanathan, Phil Kent, and Henry Baron from MicroVision for supporting and advising our team through phase 1 and 2 of our Capstone Project. Thank you to MicroVision for sponsoring our Capstone project.
• 51. 51 11. Bibliography “Algorithm Walkthrough for Tuning.” Cartographer ROS Documentation, 5 Nov. 2018, google-cartographer-ros.readthedocs.io/en/latest/algo_walkthrough.html. Consumer LiDAR Product Family: MEMS Based 3D LiDAR Engine. MicroVision, 2019. Product brief. Gregersen, Erik. “Lidar.” Encyclopædia Britannica, Encyclopædia Britannica, Inc., 13 Oct. 2016, www.britannica.com/technology/lidar. Neff, Todd. The Laser That's Changing the World: The Amazing Stories behind Lidar, from 3D Mapping to Self-Driving Cars. Prometheus Books, 2018.
• 52. 52 12. Appendices 1. Google Cartographer - high level system overview (“Algorithm Walkthrough for Tuning”) This diagram shows a high-level system view of Cartographer. Google open-sourced this algorithm in 2016 and still uses it in Google Street View to map building interiors. Cartographer provides real-time simultaneous localization and mapping (SLAM) in 2D and 3D across multiple platforms and sensor configurations. For our indoor 3D mapping system, we utilized two types of input: laser scans (provided by the MicroVision LiDAR sensor) and the angular position provided by the rotary encoder. The laser scan data first passes through a voxel filter, which down-samples the dataset. The filtered scan data and the encoder-derived angular position are then put through a pose extrapolator, which helps the algorithm predict where the next scan should be inserted into the submap. Each submap is created by passing the sensor data through scan matching and a motion filter. These submaps are then combined into a global map, creating a 3D point cloud of the environment.