summary_2.ppt
    Presentation Transcript

    • Research on Vision Systems for Small Unmanned VTOL Vehicles. K. P. Valavanis, M. Kontitsis, R. Garcia
    • A UAV Vision System for Airborne Surveillance. M. Kontitsis, K. Valavanis, N. Tsourveloudis (Technical University of Crete, University of South Florida)
    • Objectives
      • Present a methodology for the design of a machine vision system for aerial surveillance by Unmanned Aerial Vehicles (UAVs)
          • Identify a specified thermal source
          • Perform these functions on board the UAV in real time
          • Be flexible enough to be used in a variety of applications
    • Machine Vision System pipeline: IR/NIR image → noise reduction → feature extraction (size, mean intensity) → feature vector classification → persistence → alarm on/off
    • Input Images
      • IR (3 μm ~ 14 μm)
      • 8-bit grayscale
      • Near-IR camera (1 μm ~ 3 μm)
    • Noise Reduction
      • 5×5 spatial Gaussian filter
      + Smooths noise while preserving most of the features in the image (a minimal sketch follows)
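A minimal sketch of the 5×5 Gaussian smoothing step on an 8-bit grayscale frame, written in C to match the C vision code mentioned later in the presentation. The integer kernel (outer product of 1 4 6 4 1) and the border handling are illustrative choices, not necessarily those of the original system.

```c
#include <stdint.h>

/* Smooth an 8-bit grayscale image with a 5x5 Gaussian kernel.
 * Samples falling outside the frame are skipped (border renormalization). */
void gaussian5x5(const uint8_t *src, uint8_t *dst, int width, int height)
{
    static const int k[5][5] = {            /* integer Gaussian weights */
        { 1,  4,  6,  4, 1 },
        { 4, 16, 24, 16, 4 },
        { 6, 24, 36, 24, 6 },
        { 4, 16, 24, 16, 4 },
        { 1,  4,  6,  4, 1 }
    };

    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            int acc = 0, wsum = 0;
            for (int dy = -2; dy <= 2; ++dy) {
                for (int dx = -2; dx <= 2; ++dx) {
                    int yy = y + dy, xx = x + dx;
                    if (yy < 0 || yy >= height || xx < 0 || xx >= width)
                        continue;
                    int w = k[dy + 2][dx + 2];
                    acc  += w * src[yy * width + xx];
                    wsum += w;
                }
            }
            dst[y * width + x] = (uint8_t)(acc / wsum);
        }
    }
}
```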
    • Feature Extraction
      • Size of region, computed using a region growing algorithm
      • Mean intensity of region, defined as the average pixel intensity over the region
      This module extracts information about the regions in the image (see the sketch below)
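A minimal sketch of the feature-extraction step, assuming a 4-connected region growing from a seed pixel with a fixed intensity threshold; the seed-selection strategy and the threshold are placeholders, since the slides do not specify them. The function returns the two features used by the classifier: region size and mean intensity.

```c
#include <stdint.h>
#include <stdlib.h>

typedef struct { int size; double mean_intensity; } region_features_t;

/* Grow a region from (seed_x, seed_y) over pixels whose intensity is at
 * least `threshold`, returning its size and mean intensity. `visited`
 * must be a zero-initialized width*height byte map shared across calls. */
region_features_t grow_region(const uint8_t *img, uint8_t *visited,
                              int width, int height,
                              int seed_x, int seed_y, uint8_t threshold)
{
    region_features_t f = { 0, 0.0 };
    int *stack = malloc((size_t)width * height * sizeof(int));
    int top = 0;
    long sum = 0;
    int seed = seed_y * width + seed_x;

    if (img[seed] >= threshold && !visited[seed]) {
        visited[seed] = 1;
        stack[top++] = seed;
    }
    while (top > 0) {
        int idx = stack[--top];
        int x = idx % width, y = idx / width;

        sum += img[idx];
        f.size++;

        /* Push unvisited 4-neighbours that also pass the threshold. */
        const int nbr[4] = { idx - 1, idx + 1, idx - width, idx + width };
        const int ok[4]  = { x > 0, x < width - 1, y > 0, y < height - 1 };
        for (int n = 0; n < 4; ++n) {
            if (ok[n] && !visited[nbr[n]] && img[nbr[n]] >= threshold) {
                visited[nbr[n]] = 1;
                stack[top++] = nbr[n];
            }
        }
    }
    if (f.size > 0)
        f.mean_intensity = (double)sum / f.size;  /* average over the region */
    free(stack);
    return f;
}
```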
    • Feature Vector Classification Subsystem: a fuzzy classifier maps the mean intensity and the size of a region to a target identification possibility (a sketch follows the membership-function slides)
    • Mean Intensity membership functions: Low, Mid, High over grayscale values
    • Region Size membership functions: Small, Medium, Large over number of pixels
    • Objective ID Possibility membership functions
    • Output of the fuzzy classifier: possibility as a function of mean intensity and size (pixels)
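A minimal sketch of the fuzzy classification step in C. The trapezoid breakpoints, the rule set, and the output weights below are illustrative placeholders; the slides only give the structure (Low/Mid/High intensity, Small/Medium/Large size, a possibility output), and the actual membership functions were tuned by hand and later by a GA.

```c
#include <stdio.h>

/* Trapezoidal membership defined by its four corner points (a,b,c,d). */
static double trapezoid(double x, double a, double b, double c, double d)
{
    if (x <= a || x >= d) return 0.0;
    if (x >= b && x <= c) return 1.0;
    if (x < b) return (x - a) / (b - a);
    return (d - x) / (d - c);
}

/* Map (mean intensity, region size) to a target-identification possibility. */
double classify(double mean_intensity, double size_pixels)
{
    /* Membership grades for the two inputs (placeholder breakpoints). */
    double hi_int   = trapezoid(mean_intensity, 170, 210, 255, 256);
    double mid_int  = trapezoid(mean_intensity,  90, 130, 170, 210);
    double large_sz = trapezoid(size_pixels,    200, 400, 1e9, 1e9 + 1);
    double med_sz   = trapezoid(size_pixels,     50, 100, 200, 400);

    /* Rule strengths (min as the AND operator) and their output levels. */
    double r1 = hi_int  < large_sz ? hi_int  : large_sz;  /* -> high possibility   */
    double r2 = hi_int  < med_sz   ? hi_int  : med_sz;    /* -> medium possibility */
    double r3 = mid_int < large_sz ? mid_int : large_sz;  /* -> medium possibility */

    double num = r1 * 0.9 + r2 * 0.6 + r3 * 0.5;
    double den = r1 + r2 + r3;
    return den > 0.0 ? num / den : 0.0;   /* weighted-average defuzzification */
}

int main(void)
{
    printf("possibility = %.2f\n", classify(220.0, 500.0));
    return 0;
}
```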
    • Classification Example
    • Classification result (legend: p > 0.8, 0.5 < p < 0.8, p < 0.5)
    • Alarm raising
      • Persistent classification of a certain region as High Possibility raises the alarm
      • The region that raised the alarm is pinpointed by a red cross
      • The alarm stays on even if the thermal source is temporarily occluded by the surroundings or lost due to violent camera vibration
    • Alarm raising
      • Mechanism used: alarm registry (a minimal sketch follows)
      • If p_i > T_on ⇒ activate alarm
      • If p_i < T_off ⇒ deactivate alarm
      Each registry entry holds a possibility p_k, the region coordinates (i_k, j_k), and a variable persistence value
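A minimal sketch of the alarm-registry mechanism with hysteresis and persistence. The threshold values T_ON / T_OFF and the persistence limit are illustrative; the slides only state that activation uses p_i > T_on, deactivation uses p_i < T_off, and that the alarm survives temporary occlusion or camera vibration.

```c
#include <stdbool.h>

#define T_ON   0.8   /* activate the alarm above this possibility        */
#define T_OFF  0.5   /* deactivate only when the possibility drops below */
#define P_MAX  30    /* frames the alarm survives an occluded target     */

typedef struct {
    double p;            /* latest possibility for this region            */
    int    row, col;     /* region coordinates (pinpointed on screen)     */
    int    persistence;  /* frames left before the alarm is released      */
    bool   alarm;
} alarm_entry_t;

void update_alarm(alarm_entry_t *e, double p, int row, int col)
{
    e->p   = p;
    e->row = row;
    e->col = col;

    if (p > T_ON) {
        e->alarm = true;            /* persistent high possibility raises it */
        e->persistence = P_MAX;
    } else if (p < T_OFF) {
        /* Do not drop the alarm immediately: the target may merely be
         * occluded or lost to camera vibration for a few frames. */
        if (e->persistence > 0)
            e->persistence--;
        else
            e->alarm = false;
    }
    /* For T_OFF <= p <= T_ON the current alarm state is kept (hysteresis). */
}
```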
    • Complexity
      • Noise reduction → O(n²) for an n×n image
      • Region growing → O(n²) for an n×n image
      • Fuzzy logic classifier* → O(n·m)
      • * in its current implementation
      • n inputs, m rules
    • Case Study: forest fires; membership functions adjusted manually
    • Classification example: thermal source (fire), objective present
    • Classification result, objective present (legend: possibility > 0.7, 0.5 < possibility < 0.7, possibility < 0.5)
    • Classification example: objective absent
    • Classification result, objective absent (legend: possibility > 0.7, 0.5 < possibility < 0.7, possibility < 0.5)
    • Classification Result (Video) (objective present)
    • Classification Result (Video) (objective absent)
    • Automatic Parameter Selection
      • Parameters a_ij, b_ij, c_ij, d_ij for i = 1 and j = 1, 2, 3, which define the shape of the Mean Intensity membership functions
    • Basic Elements of the Genetic Algorithm
      • Chromosome ⇒ parameters x = (a_ij, b_ij, c_ij, d_ij)
      • Fitness function:
      fitness(x) = 1 for correct activation or deactivation of the alarm; fitness(x) = 0 in any other case
    • Basic Elements of the Genetic Algorithm
      • Selection operator → selects individuals for mating as many times as the ratio of their fitness to the total fitness of the population
      • Crossover operator → crossover probability p_c = 0.7
      • Mutation operator → mutation probability p_m = 0.001 (a minimal sketch of one GA generation follows)
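A minimal sketch of one GA generation as described on these slides: chromosomes hold the membership-function parameters, selection is fitness-proportional, single-point crossover uses p_c = 0.7, and per-gene mutation uses p_m = 0.001. The population size, gene count, mutation step, and the stubbed evaluate() function are placeholders; the real fitness scores correct alarm activation/deactivation on labelled data.

```c
#include <stdlib.h>

#define GENES     12      /* e.g. 3 membership functions x 4 parameters */
#define POP_SIZE  50
#define P_CROSS   0.7
#define P_MUT     0.001

typedef struct { double gene[GENES]; double fitness; } chrom_t;

static double rnd(void) { return (double)rand() / RAND_MAX; }

/* Placeholder: in the original formulation, 1 for correct alarm
 * activation/deactivation on a labelled frame, 0 otherwise. */
static double evaluate(const chrom_t *c) { (void)c; return rnd(); }

/* Fitness-proportional (roulette-wheel) selection. */
static const chrom_t *select_parent(const chrom_t *pop, double total_fitness)
{
    double r = rnd() * total_fitness, acc = 0.0;
    for (int i = 0; i < POP_SIZE; ++i) {
        acc += pop[i].fitness;
        if (acc >= r) return &pop[i];
    }
    return &pop[POP_SIZE - 1];
}

void evolve_one_generation(chrom_t *pop)
{
    chrom_t next[POP_SIZE];
    double total = 0.0;

    for (int i = 0; i < POP_SIZE; ++i) {
        pop[i].fitness = evaluate(&pop[i]);
        total += pop[i].fitness;
    }
    for (int i = 0; i < POP_SIZE; i += 2) {
        chrom_t a = *select_parent(pop, total);
        chrom_t b = *select_parent(pop, total);

        if (rnd() < P_CROSS) {                 /* single-point crossover */
            int cut = 1 + rand() % (GENES - 1);
            for (int g = cut; g < GENES; ++g) {
                double tmp = a.gene[g];
                a.gene[g] = b.gene[g];
                b.gene[g] = tmp;
            }
        }
        for (int g = 0; g < GENES; ++g) {      /* per-gene mutation */
            if (rnd() < P_MUT) a.gene[g] += (rnd() - 0.5) * 10.0;
            if (rnd() < P_MUT) b.gene[g] += (rnd() - 0.5) * 10.0;
        }
        next[i] = a;
        if (i + 1 < POP_SIZE) next[i + 1] = b;
    }
    for (int i = 0; i < POP_SIZE; ++i) pop[i] = next[i];
}
```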
    • Mean Intensity M.F. as evolved by GA
    • Result (using GA for parameter selection)
    • Remarks
      • Adjustable for a variety of applications
      • Real-time execution
      • Correct identification rate of about 90%
      • False alarms are not entirely avoided (especially in the system evolved by the GA)
    • Design, Implementation and Testing of a Vision System for Small Unmanned VTOL Vehicles. K. P. Valavanis, M. Kontitsis, R. Garcia
    • Aim of the work
      • To explore the design alternatives in implementing a functional vision system for a small unmanned VTOL.
      • Two different approaches were examined:
        • On-board processing
        • On-the-ground processing
    • Limitations
      • Weight Limitations
      • Power Supply Limitations
      • Processing power issues
      • Communications
    • Centralized approach (processing-wise): the Ground Control Station (GCS) and the UAV/VTOL are connected by a data link (telemetry + video); processing is left to the PC on the Ground Control Station, which sends commands back to the vehicle.
      • On the UAV/VTOL: sensing (camera), transmitting data to the GCS
      • On the GCS: map building, target identification, command issuing
    • De-centralized approach (processing-wise): the same GCS and UAV/VTOL data link (telemetry + video), but processing is carried out locally on the PC onboard the UAV/VTOL.
      • On the UAV/VTOL: sensing (camera), map building, target identification, transmitting data and the alarm signal to the GCS
      • On the GCS: command issuing
    • General trends in the area (Institution | Vehicle | Processing unit | Machine vision techniques used):
      • Linkoping University, Sweden (WITAS) [25] | Rmax by Yamaha | On-board | No details provided
      • Southern Polytechnic State University [14] | Vario Robinson R22 | On-the-ground | Stereo vision, Sobel edge detector
      • USC [3][5] | Bergen Twin | On-board | Omnidirectional, optic flow
      • Carnegie Mellon University [24] | Rmax by Yamaha | On-the-ground | Template matching and RGB color
      • Swiss Federal Institute of Technology (ETH) [23] | Huner Technik | On-board, integrated in camera | No details provided
      • University of Texas [13] | XCell .60 | On-the-ground | Edge linking, matching
      • IT Berlin [15] | MARVIN by SSM Technik | On-the-ground | No details provided
      • Rose Hulman IT (RHIT) [22] | Bergen Twin | On-board | Template comparison
      • MIT [21] | Black Star by TSK | On-the-ground | Template matching
      • Stanford University [10][12] | Hummingbird (Aerospace Robotic Laboratory at Stanford) | On-the-ground | YUV color segmentation, signum of Laplacian of Gaussian (sLoG)
      • Georgia Tech [18][19] | Rmax by Yamaha | On-board | Edge detectors, morphing, statistical pattern matching
      • Berkeley University [6] | BEAR | No details provided | No details provided
    • Functionality and characteristics compared across institutions (CNRS [27], WITAS [25], COMETS [26], Univ. of Southern California, Georgia Tech, Berkeley University): methods used (template matching, IMU data, motion estimation, optic flow), capabilities (object tracking, object identification, 3D reconstruction / depth mapping), and experimental setup (calibrated cameras, natural landmarks, known landmarks, static / man-made environment, dynamic environment, dynamic observer)
    • Processing on the Ground
    • Hardware configuration (on the ground processing)
    • Vision algorithm overview
    • Experimental Results
    • “Mine” detection results
    • “Mine” detection results 2
    • “Mine” detection results 3
    • “Mine” detection results 4
    • Processing on board the VTOL
    • Raptor 90 with on-board vision system
    • On-board processing: Firewire camera → onboard PC (used for processing) → wireless 802.11b link → ground computer (used for monitoring)
    • On board system
    • Onboard system architecture (hardware)
    • On board system
      • 1.2 GHz EPIA Processor
      • Via Embedded motherboard
      • Unibrain Firewire Camera
      • 1 GB 266 MHz RAM
      • 1 GB Compact Flash
      • Compact Flash to IDE adapter
      • Motorola M12+ GPS Receiver
      • 8 Channel Servo Controller
      • 200 W Power Supply
      • 14.8 V LiPo Battery
      • 12 V Voltage Regulator
      • 802.11b CardBus adapter
    • Key Hardware Components
      • Mini-ATX motherboard
        • Low weight
        • Small size
      • Unibrain Firewire camera
        • Lightweight (60g)
        • Built-in Firewire interface
      • 1 GB Compact Flash
        • Replaces the vibration-sensitive hard drive
      • Lithium Polymer (LiPo) batteries
        • High current output for its size
    • Software Details
      • Linux operating system (Slackware v10)
      • Open source libraries libdv, libraw1394, libavc1394, libdc1394 used for Firewire access
      • Vision code written in C for speed
      • Minor Software Enhancements and Experimental Results
    • Detection of more than one object
      • Using the same vision algorithm
      • Employing Regions of Interest (ROIs) on the image to separate objects
        • Byproduct → execution speedup (about ×2 on average), since only the pixels in the ROIs are processed in every frame
          • Every X (typically 5 to 10) frames the algorithm searches the whole image for new objects (a minimal sketch of this scheduling follows)
        • Regions can be tracked using the pan/tilt unit to keep the object inside the frame while the VTOL moves
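A minimal sketch of the ROI scheduling described above: on most frames only the pixels inside the active ROIs are re-examined, and every Nth frame the whole image is rescanned for new objects. The detect_in_window() entry point is a hypothetical stand-in for the actual vision routine, and the scan period and ROI limit are illustrative.

```c
#include <stddef.h>

#define FULL_SCAN_PERIOD 5      /* slides quote "typically 5 to 10" frames */
#define MAX_ROIS        16

typedef struct { int x, y, w, h; } roi_t;

/* Hypothetical stub standing in for the real detector; it would fill
 * `rois` with the bounding rectangles of objects found inside the window. */
static void detect_in_window(const unsigned char *img, int width, int height,
                             int x, int y, int w, int h,
                             roi_t *rois, size_t *n_rois)
{
    (void)img; (void)width; (void)height;
    (void)x; (void)y; (void)w; (void)h;
    (void)rois; (void)n_rois;
}

void process_frame(const unsigned char *img, int width, int height,
                   roi_t *rois, size_t *n_rois, unsigned long frame_no)
{
    if (frame_no % FULL_SCAN_PERIOD == 0) {
        /* Periodic full-image search picks up new objects. */
        *n_rois = 0;
        detect_in_window(img, width, height, 0, 0, width, height,
                         rois, n_rois);
    } else {
        /* Otherwise only re-examine the existing regions of interest,
         * which is where the roughly 2x average speedup comes from. */
        for (size_t i = 0; i < *n_rois; ++i) {
            size_t unused = 0;
            detect_in_window(img, width, height,
                             rois[i].x, rois[i].y, rois[i].w, rois[i].h,
                             rois + i, &unused);
        }
    }
}
```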
    • Detection of more than one object: objects of interest are enclosed in a rectangle
    • Detection of more than one object: objects of interest enclosed in rectangles, plus one false alarm
    • Detection of more than one object: objects of interest, with one rectangle not well positioned
    • Communication Issues (for the on-board system)
    • Communication channels on the VTOL (autonomous operation): the on-board computer connects to the IMU, GPS, pan/tilt servos, control servos, and an 802.11b/g link
    • Communication channels on the VTOL (autonomous operation): the same layout, with the critical channels highlighted
    • Data going into the PC
      • Uncompressed images at 30 frames / sec (critical for object recognition/navigation)
      • Inertial Measurement Unit (IMU) data 4 to 10 Hz (critical for navigation)
      • GPS data (critical for navigation)
    • Data coming out of the PC
      • For monitoring purposes (not critical)
        • Compressed images at 30 frames / sec
        • IMU data 4 to 10 Hz
        • GPS data
      • Critical to the operation of the VTOL
        • Commands to servo-boards
        • Object identification alarms
    • Bandwidth requirements for input channels
      • 640×480 images at 30 frames/sec → approx. 220 Mbps (see the calculation below)
        • Firewire link, IEEE 1394 (400 Mbps)
      • IMU data at 10 Hz → approx. 6 kbps
        • Serial RS-232
      • GPS data at 4 Hz → < 1 kbps
        • Serial RS-232
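A short check of the quoted video bandwidth, assuming uncompressed 24-bit colour frames, which is what the approx. 220 Mbps figure matches (8-bit grayscale frames would need about a third of that):

```latex
640 \times 480 \,\tfrac{\text{pixels}}{\text{frame}}
\times 24 \,\tfrac{\text{bits}}{\text{pixel}}
\times 30 \,\tfrac{\text{frames}}{\text{s}}
\approx 2.21 \times 10^{8} \,\tfrac{\text{bits}}{\text{s}}
\approx 220\ \text{Mbps}
```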
    • Bandwidth requirements for output channels
      • Commands to servos → < 1 kbps
        • Serial RS-232
      • Telemetry data and compressed video at 30 fps → approx. 1.5 to 4 Mbps (depending on image quality)
        • 802.11b/g (11/54 Mbps)
    • Bandwidth issues of the 802.11
      • Video data floods the wireless channel
      • Bandwidth decreases with range
        • As a result, the video is displayed on the ground station at less than 30 fps
    • Analog video channel
      • Replaces the digital Firewire (IEEE 1394) link
      • Delivers 30 fps regardless of range
      • Independent of the onboard PC
      • Frees 802.11 bandwidth for other purposes
        • A frame grabber is required for digitization so that the PC can process the images
    • Communication channels on the VTOL (alternative design): on-board computer with IMU, GPS, pan/tilt servos, control servos, 802.11b/g link, and an analog RF video transmitter (900 MHz)
    • Communication channels on the VTOL (semi-autonomous operation): on-board computer with IMU, GPS, pan/tilt servos, control servos, 802.11b/g link, and an RF transceiver (72 MHz); critical channels highlighted
    • EM compatibility
      • The onboard digital channels exhibited no compatibility problems
      • Theoretically the RF channels are well separated in frequency, but…
      • The RF channels are still vulnerable to interference
    • Security issues
      • Both the 802.11 and the RF channels can be secured using encryption
      • This is easier to do on the 802.11 link
    • Conclusions
      • Real time processing rate achieved
      • The on-board system is preferred because it promotes autonomy
    • Future work
      • Motion estimation
      • Structure from motion
      • Expand the feature space (size, color → texture, shape, etc.)
      • Visual Simultaneous Localization And Mapping
      • Quantify relationship between weight-power-algorithmic complexity
      • Optimization of vision routines
      • Use of a better processor
    • Application on real-time traffic data extraction
    • Aim of the work
      • Design a vision algorithm to run on a small VTOL, capable of extracting real-time traffic data from video.
      • The data will be used as input to traffic simulation models
    • Algorithm overview: images from the camera, together with IMU & GPS data, pass through stabilization, motion extraction, feature extraction, feature grouping, and vehicle tracking; an environment setup selection step configures the pipeline, whose output is the traffic statistics
    • Motion estimation
      • Motion is extracted by differencing two consecutive frames (a minimal sketch follows)
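A minimal sketch of the motion-extraction step: the absolute difference of two consecutive grayscale frames, thresholded into a binary motion mask. The threshold value is an illustrative placeholder.

```c
#include <stdint.h>
#include <stdlib.h>

/* Mark as moving (255) every pixel whose intensity changed by more than
 * `threshold` between the previous and the current frame. */
void frame_difference(const uint8_t *prev, const uint8_t *curr,
                      uint8_t *motion_mask, int n_pixels, uint8_t threshold)
{
    for (int i = 0; i < n_pixels; ++i) {
        int d = abs((int)curr[i] - (int)prev[i]);
        motion_mask[i] = (d > threshold) ? 255 : 0;
    }
}
```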
    • Feature extraction and grouping
      • A morphological operator (dilation, applied twice) is used on the image of differences to group together the scattered pixels of an object (a minimal sketch follows)
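A minimal sketch of one 3×3 binary dilation pass over the motion mask; per the slide it is applied twice to merge the scattered pixels of a single vehicle. The 3×3 structuring element is an assumption, since the slides do not give its size.

```c
#include <stdint.h>
#include <string.h>

/* One pass of 3x3 binary dilation: every set pixel in `src` turns its
 * whole 3x3 neighbourhood on in `dst`. */
void dilate3x3(const uint8_t *src, uint8_t *dst, int width, int height)
{
    memset(dst, 0, (size_t)width * height);
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            if (!src[y * width + x])
                continue;
            for (int dy = -1; dy <= 1; ++dy) {
                for (int dx = -1; dx <= 1; ++dx) {
                    int yy = y + dy, xx = x + dx;
                    if (yy >= 0 && yy < height && xx >= 0 && xx < width)
                        dst[yy * width + xx] = 255;
                }
            }
        }
    }
}
```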
    • Grouping (continued)
      • The extracted regions are enclosed in Minimum Bounding Rectangles (MBRs); see the sketch below
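A minimal sketch of extracting the Minimum Bounding Rectangles: each connected blob in the dilated mask is flood-filled once while tracking the extreme coordinates reached. The 4-connectivity and the fixed output limit are illustrative choices.

```c
#include <stdint.h>
#include <stdlib.h>

typedef struct { int x_min, y_min, x_max, y_max; } mbr_t;

/* Fill `out` with the MBR of every connected blob in a binary mask and
 * return the number of rectangles written (at most max_mbrs). */
int extract_mbrs(const uint8_t *mask, int width, int height,
                 mbr_t *out, int max_mbrs)
{
    uint8_t *seen = calloc((size_t)width * height, 1);
    int *stack = malloc((size_t)width * height * sizeof(int));
    int count = 0;

    for (int start = 0; start < width * height && count < max_mbrs; ++start) {
        if (!mask[start] || seen[start])
            continue;
        mbr_t box = { start % width, start / width,
                      start % width, start / width };
        int top = 0;
        seen[start] = 1;
        stack[top++] = start;
        while (top > 0) {
            int idx = stack[--top];
            int x = idx % width, y = idx / width;
            if (x < box.x_min) box.x_min = x;   /* grow the bounding box */
            if (x > box.x_max) box.x_max = x;
            if (y < box.y_min) box.y_min = y;
            if (y > box.y_max) box.y_max = y;
            const int nbr[4] = { idx - 1, idx + 1, idx - width, idx + width };
            const int ok[4]  = { x > 0, x < width - 1, y > 0, y < height - 1 };
            for (int n = 0; n < 4; ++n)
                if (ok[n] && mask[nbr[n]] && !seen[nbr[n]]) {
                    seen[nbr[n]] = 1;
                    stack[top++] = nbr[n];
                }
        }
        out[count++] = box;
    }
    free(stack);
    free(seen);
    return count;
}
```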
    • Selecting “counting zones”
      • Regions of the image are selected as counting zones.
      • Cars are counted as they enter and leave them.
      • Shaded areas mark the counting zones.
      • Colors are used to differentiate between them.
    • Sample result: input image with overlaid region markers, and the relative traffic load per region
    • Output Data
      • Algorithm can provide the following data:
        • # of cars on a specific link at any point in time
          • Assuming that a link is sufficiently small to fit in the camera's field of view (multiple VTOLs are needed to cover a significant area)
        • Average flow per link
    • Ongoing work
      • Automatic selection and placement of “counting zones”
      • Create tables of data suitable for the simulation software
      • New input data available (show video)