Vision & Motion – Frame throughput is improved by using the hardware H.264 video encoder to calculate optic flow. The motion data is segmented into foreground and background classes (Figure 4) to provide an additional input to the sensor fusion system.
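The poster does not include source code, but the technique maps directly onto the Raspberry Pi camera stack. A minimal sketch, assuming the picamera library is used to read the per-macroblock motion vectors that the hardware H.264 encoder produces as a by-product of encoding; the threshold and resolution are illustrative assumptions, not values from the poster:

```python
# Sketch: consume H.264 motion vectors from the Pi's hardware encoder
# via picamera, then threshold vector magnitude to split macroblocks
# into foreground (moving) and background classes.
import numpy as np
import picamera
import picamera.array

FG_THRESHOLD = 4.0  # assumed magnitude threshold, in macroblock units

class MotionSegmenter(picamera.array.PiMotionAnalysis):
    def analyse(self, a):
        # a['x'] and a['y'] are per-macroblock motion vectors computed
        # by the encoder, so no CPU-side optic flow is needed.
        magnitude = np.sqrt(a['x'].astype(np.float32) ** 2 +
                            a['y'].astype(np.float32) ** 2)
        foreground = magnitude > FG_THRESHOLD  # boolean mask per block
        # ...pass the mask and mean background flow to sensor fusion...

with picamera.PiCamera(resolution=(640, 480), framerate=30) as camera:
    # Encode to /dev/null; only the motion-vector side channel is used.
    camera.start_recording('/dev/null', format='h264',
                           motion_output=MotionSegmenter(camera))
    camera.wait_recording(10)
    camera.stop_recording()
```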
Control & Tuning – In autonomous flight, PI controllers hold the blimp's heading, altitude and speed constant relative to the target. The controller step responses were tuned during test flights (Table 1).
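As an illustration, a minimal PI loop for the yaw (heading) channel is sketched below. The proportional gain of 10 is the value found in testing (Table 1); the integral gain, output limit and anti-windup scheme are assumptions, since the poster does not give them.

```python
# Minimal PI controller sketch for the yaw channel.
class PIController:
    def __init__(self, kp, ki, output_limit):
        self.kp, self.ki = kp, ki
        self.output_limit = output_limit
        self.integral = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        u = self.kp * error + self.ki * self.integral
        if abs(u) > self.output_limit:
            # Clamp the output and undo the last integration step so
            # the integral does not wind up while saturated.
            u = max(-self.output_limit, min(self.output_limit, u))
            self.integral -= error * dt
        return u

yaw_pi = PIController(kp=10.0, ki=1.0, output_limit=1.0)  # ki assumed
# Each control cycle:
#   differential_thrust = yaw_pi.update(heading_error, dt)
```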
Conclusions – The blimp provides a hardware and software platform that demonstrates autonomous station-keeping behaviour without relying on additional ground-based equipment. Future work will refine the vision system and improve the reliability of the control systems.
Untethered
Autonomous Flight of
an Indoor Blimp
Andrew Mathieson
Supervisor: Toby Breckon
School of Engineering & Computer
Sciences, Durham University
Abstract – A prototype autonomous airship was developed and tested indoors (Figure 1). Until now, autonomous blimps have utilised additional ground-based equipment to calculate their position. This project instead uses visual and inertial sensing to perform all processing on board.
Hardware – Low-cost commercial hardware
components were integrated into the blimp
payload. The key objectives for component
selection were weight minimisation and software
availability. The hardware comprises an Inertial
Measurement Unit (IMU), Pulse Width Modulated
(PWM) servo outputs and optically isolated motor
drivers (Figure 2).
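For illustration, a hedged sketch of driving one of the PWM servo outputs from the Raspberry Pi with the RPi.GPIO library; the pin number and pulse-width mapping are assumptions, as the poster does not specify the wiring.

```python
# Sketch: one PWM servo channel via RPi.GPIO (pin assignment assumed).
import RPi.GPIO as GPIO

SERVO_PIN = 18  # assumed BCM pin number

GPIO.setmode(GPIO.BCM)
GPIO.setup(SERVO_PIN, GPIO.OUT)
servo = GPIO.PWM(SERVO_PIN, 50)  # standard 50 Hz servo frame
servo.start(7.5)                 # 7.5% duty ~ 1.5 ms pulse ~ centre

def set_angle(angle_deg):
    # Map -90..+90 degrees onto roughly 1.0-2.0 ms pulses,
    # i.e. 5-10% duty cycle at 50 Hz.
    servo.ChangeDutyCycle(7.5 + (angle_deg / 90.0) * 2.5)
```

In practice a hardware PWM channel or a dedicated servo driver gives more stable pulses than RPi.GPIO's software-timed PWM.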
Software – The software estimates the blimp's pose relative to a chessboard target using the on-board camera, and fuses these pose estimates with data from the IMU. It is divided into four concurrent threads (Figure 3) and runs on a Raspberry Pi 2.
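A compact sketch of that pipeline, assuming OpenCV for the chessboard pose estimate; the board geometry, camera intrinsics and blend factor are placeholders, and the complementary filter stands in for the fusion step, which the poster does not detail.

```python
# Sketch: chessboard pose estimation (OpenCV) fused with gyro yaw.
import cv2
import numpy as np

PATTERN = (9, 6)   # assumed inner-corner count of the target
SQUARE = 0.025     # assumed square size in metres
K = np.array([[600.0, 0.0, 320.0],   # assumed camera intrinsics
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
DIST = np.zeros(5)                   # assumed: no lens distortion

objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

def vision_pose(gray):
    """Return (rvec, tvec) of the board in camera coordinates, or None."""
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        return None
    ok, rvec, tvec = cv2.solvePnP(objp, corners, K, DIST)
    return (rvec, tvec) if ok else None

ALPHA = 0.98  # assumed complementary-filter blend factor
def fuse_yaw(yaw_est, gyro_rate, dt, vision_yaw):
    # Trust the gyro over short intervals and the absolute (but noisy,
    # lower-rate) vision yaw, extracted from rvec, over long ones.
    yaw_est += gyro_rate * dt
    if vision_yaw is not None:
        yaw_est = ALPHA * yaw_est + (1.0 - ALPHA) * vision_yaw
    return yaw_est
```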
MEng Research & Development Project, April 2015
Figure 1 – Blimp in flight
Figure 2 – Hardware components
Figure 3 – Software architecture
Figure 4 – Captured frame and HSV representation of motion
Ziegler-Nichols predicted K_P,yaw:  5.7 (underestimate)
Harriot/Nyquist predicted K_P,yaw:  19 (overestimate)
K_P,yaw value found in testing:     10 (achieves good control)
Table 1 – Proportional gain (K_P,yaw) tuning: comparison of predictions from two analytical methods with the experimental result
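For context, the classic Ziegler-Nichols closed-loop rule for a PI controller predicts K_P = 0.45 K_u, where K_u is the ultimate gain at which the closed loop oscillates steadily. A worked sketch follows; K_u and T_u are chosen here only so the rule reproduces the 5.7 prediction in Table 1, as the measured values are not on the poster.

```python
# Ziegler-Nichols closed-loop tuning rule for a PI controller.
def ziegler_nichols_pi(ku, tu):
    kp = 0.45 * ku      # proportional gain
    ti = tu / 1.2       # integral time constant
    return kp, kp / ti  # (K_P, K_I)

kp, ki = ziegler_nichols_pi(ku=12.7, tu=2.0)  # assumed K_u, T_u
print(f"K_P = {kp:.1f}, K_I = {ki:.2f}")      # K_P = 5.7 for K_u = 12.7
```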