Object Distance Detection using a Joint Transform
Correlator
Alexander Layton
Dept. of Computer Science
University of Illinois at Urbana-Champaign
Urbana, IL
Dr. Ronald Marsh
Dept. of Computer Science
University of North Dakota
Grand Forks, ND
Abstract—Computer stereo vision makes heavy use of object
distance detection. The primary method to detect object distance
is to compare two images of the same scene taken from different
vantage points. The necessity of comparing two images naturally
leads us to investigate optical correlators.
Since the Fourier transform, on which optical correlators are
based, is lossless, we suppose that distance information encoded in
a stereo image pair is preserved through the correlation process.
We then try to recover that distance by investigating the location
of the correlation peaks.
Initial data indicates that we may plausibly extract distance
information from a correlation result. However, this data was
gathered under very specific and controlled conditions, and
further research is necessary to derive a more general result.
Keywords—computer vision; stereo vision; optics; optical
correlation; joint transform correlator; distance detection
I. INTRODUCTION
A. Optical Correlators
An optical correlator is a device for comparing two images
using Fourier transforms. More specifically, an optical
correlator takes in two input images and outputs their cross-
correlation. We may think of the correlation result as locating
one image inside the other. If the two input images are
sufficiently similar, we will see a bright spot in the correlation
result, hereafter referred to as a peak. The location of the peak
indicates the location of one image within the other.
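The cross-correlation an optical correlator produces can be reproduced digitally via the correlation theorem. The following NumPy sketch (an illustration, not the apparatus used in this work) locates one image's content inside the other by the brightest point of the correlation surface:

```python
import numpy as np

def cross_correlate(a, b):
    # correlation theorem: corr(a, b) = IFFT(conj(FFT(a)) * FFT(b))
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    return np.fft.ifft2(np.conj(A) * B).real

rng = np.random.default_rng(0)
scene = rng.random((64, 64))
shifted = np.roll(scene, (5, 9), axis=(0, 1))  # same scene, displaced

corr = cross_correlate(scene, shifted)
peak = np.unravel_index(np.argmax(corr), corr.shape)
# the peak's location recovers the displacement between the two inputs
```

Because the FFT treats the images as periodic, the recovered displacement is only defined modulo the image size, a point that becomes important later.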
A. VanderLugt developed the first successful optical
correlator, the matched filter correlator, in 1963 [1]. The
matched filter correlator (MFC) pioneered the development of
optical correlators and is still in use today. However, the MFC
design requires specialized hardware and is highly sensitive to
the alignment of its instruments.
The MFC was originally designed to locate a particular
image, known as the “reference” or “filter” image, inside many
other images, the “target” or “input” images. As such, the
reference and target image are treated differently within the
correlator, and the MFC is best suited to such asymmetrical
applications.
To overcome the limitations of the matched filter correlator,
Weaver and Goodman invented the joint transform correlator
(JTC) in 1966 [2]. The JTC is much less sensitive to instrument
alignment but is less space-efficient. Additionally, both input
images in a JTC undergo the same transformations, without
regard for a “target” or “reference” image. Thus, the JTC is
better suited to applications without preferential treatment of
either input image. (It should be noted, however, that while the
JTC does not discriminate between input images, the
correlation result depends on each image’s position in the focal
plane. Thus we will not obtain the same result if we swap our
two input images.)
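A digital analogue of the JTC makes this concrete: both inputs share one focal plane, their joint power spectrum is formed, and a second transform yields a central autocorrelation term flanked by two cross-correlation peaks. The sketch below is illustrative only (the function name is ours, and the inputs are zero-meaned to suppress the dominant central term), not the optical device:

```python
import numpy as np

def joint_transform_correlate(img1, img2):
    h, w = img1.shape
    # both inputs share one focal plane, separated horizontally
    plane = np.zeros((h, 3 * w))
    plane[:, :w] = img1 - img1.mean()        # zero-mean to suppress
    plane[:, 2 * w:] = img2 - img2.mean()    # the central DC term
    jps = np.abs(np.fft.fft2(plane)) ** 2    # joint power spectrum
    out = np.abs(np.fft.fft2(jps))           # second transform
    return np.fft.fftshift(out)              # peaks measured from center
```

The two cross-correlation peaks appear symmetrically about the center, at an offset set by the images' separation in the plane plus their relative shift; swapping the inputs mirrors the peak positions, consistent with the caveat about image position above.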
Optical correlation need not be done optically anymore; the
same process can be performed programmatically [3]. Though
the correlation is no longer real-time, one has the opportunity
to post-analyze the correlation result.
B. Application to Object Distance Detection
At the root of computer stereo vision is the use of two or more
2D images to reproduce a 3D scene. In particular, a computer
detects the distance to an object by measuring the shift in the
object’s 2D location from one vantage point to another [4].
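The standard relation from [4] ties that measured shift (the disparity) to distance through the pinhole camera model. A minimal sketch, in which the focal length in pixels is derived from the field of view (the 640-pixel width, 73° FOV, and 9.5-inch baseline below are substituted purely as example values):

```python
import math

def focal_length_px(image_width_px, fov_deg):
    # pinhole model: f = (W / 2) / tan(FOV / 2)
    return (image_width_px / 2) / math.tan(math.radians(fov_deg) / 2)

def distance_from_disparity(disparity_px, baseline, focal_px):
    # similar triangles: Z = f * B / d
    return focal_px * baseline / disparity_px

f = focal_length_px(640, 73.0)             # hypothetical 640-px-wide frame
z = distance_from_disparity(40.0, 9.5, f)  # baseline in inches -> Z in inches
```

Distance is inversely proportional to disparity, which is why a logarithmic-looking relation between peak position and distance is plausible in the data below.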
Comparing two images of the same object leads us naturally
to explore optical correlators. Since the Fourier transform is
lossless, we posit that any distance information encoded in the
original images will be preserved in their correlation. In effect,
the correlator automates the process of finding a common pixel
in the algorithm outlined in [5]. Since we use a new pair of
images for each distance measurement, the joint transform
correlator is the appropriate design.
This experiment is entirely exploratory in nature; we are
only concerned with the plausibility of applying optical
correlation to object distance detection, not the practicality. We
hope, however, that this research may find applications in
space, where the background is more or less static.
Nevertheless, to the best of our knowledge, the use of a JTC to
determine distance to an object is a novel approach.
II. EXPERIMENT
A. Setup
Images were generated by two Microsoft LifeCam Cinema
webcams aligned horizontally with 9.5” baseline separation
between cameras (Fig. 1). LifeCam Cinema webcams have a
73° field of view and an autofocusing lens.
This research was made possible by the National Science Foundation,
Award #1359224, with support from the Department of Defense.
Fig. 1. Microsoft LifeCam Cinema webcams used for data collection.
Fig. 2. Post-It note on wall from 5ft away.
The subject was a 2”x1.5” Post-It note on a wall (Fig. 2).
We photograph a wall to minimize the confounding effect of
the background on the correlation. To this end, we make sure
the wall is as evenly lit as possible. The Post-It note was chosen
to contrast strongly with the wall, making it easier for the JTC to
identify the object.
B. Procedure
We measured the distance to the wall with a tape measure,
then took a picture with each camera. We used ImageMagick to
crop each image and then extract the value channel, producing
a grayscale square image for the correlator. Each input
image served once as the “target” and once as the “filter.” The
correlation program also attempted to detect correlation peaks
using an algorithm developed by Dr. Marsh. The actual distance
and peak data were exported into an Excel spreadsheet.
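The preprocessing was done with ImageMagick; an equivalent pass (center-crop to a square, then take the HSV value channel, i.e. the per-pixel maximum of R, G, and B) might look like the following sketch:

```python
import numpy as np

def preprocess(rgb):
    """Center-crop an H x W x 3 array to a square, then extract value."""
    h, w, _ = rgb.shape
    s = min(h, w)
    y0, x0 = (h - s) // 2, (w - s) // 2
    square = rgb[y0:y0 + s, x0:x0 + s]
    return square.max(axis=-1)   # HSV value = max(R, G, B) per pixel
```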
The cameras would frequently fail to focus on the wall; this
led to blurry input images that would not produce a strong
correlation peak. These correlation results were thrown out.
Frequently, a peak would be plainly visible but not strong
enough for the algorithm to detect it. For example, a peak that
occupies two pixels in the output image would slip through the
detector since neither pixel has sufficient contrast with its
neighbor, despite being clearly enough resolved to be a useful
data point. In these cases, the peak locations were entered
manually.
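Dr. Marsh's detector is not reproduced here, but the failure mode just described can be illustrated with a toy detector that pools 2×2 neighborhoods, so that a peak split across two adjacent pixels is scored as one peak rather than two weak ones (the function and threshold are hypothetical):

```python
import numpy as np

def detect_peak(corr, snr_threshold=5.0):
    # sum 2x2 neighborhoods so a peak split across adjacent pixels
    # is scored as a whole rather than as two weak pixels
    pooled = (corr[:-1, :-1] + corr[1:, :-1]
              + corr[:-1, 1:] + corr[1:, 1:])
    y, x = np.unravel_index(np.argmax(pooled), pooled.shape)
    background = pooled.mean() + 1e-12
    if pooled[y, x] / background < snr_threshold:
        return None              # no convincing peak: discard the frame
    return y, x
```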
III. DATA
Each stereo pair of input images (Fig. 3) produced a pair of
correlation results (Fig. 4).
Fig. 3. A pair of input images.
Fig. 4. A pair of correlation results. The peak is the cross-shaped mark.
Fig. 5. Coordinates of the peak vs. distance from wall.
We plot the location of the peak versus the distance to the
wall to find a clear relation (Fig. 5). Coordinates of the peak are
measured from center, since distance to the wall should not
depend on the size of the image. Note that we take the absolute
value of the coordinates since swapping the input images
negates the peak’s coordinates.
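In code, this measurement amounts to re-expressing the argmax of the correlation result relative to the image center and taking absolute values; a small helper under these conventions (the name is ours):

```python
import numpy as np

def centered_peak_coords(corr):
    # express the peak relative to the image center so the measurement
    # does not depend on image size; |.| folds together the two mirrored
    # results obtained by swapping the input images
    y, x = np.unravel_index(np.argmax(corr), corr.shape)
    cy, cx = corr.shape[0] // 2, corr.shape[1] // 2
    return abs(int(x) - cx), abs(int(y) - cy)   # (|Peak X|, |Peak Y|)
```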
A. Wrapping and Flipping Effect
As distance to the object decreases, the X coordinate of the
peak increases, up to a maximum of half the image size. With
our testing environment, this occurs at around 1.5 feet. If the actual
distance to the object decreases beyond that, the peak wraps
around the image and both of its coordinates flip (Fig. 6).
Fig. 6. Effects of an off-image peak.
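The wrap itself follows from the correlation being circular: offsets are only defined modulo the image size. A sketch of folding a measured offset back into the principal range (this cannot resolve the flip ambiguity; it only makes the wrap explicit):

```python
def unwrap_offset(measured, size):
    # circular correlation reports offsets modulo the image size;
    # fold the measurement into the range (-size/2, size/2]
    m = measured % size
    return m - size if m > size // 2 else m
```

Once the true offset exceeds half the image size, the folded value is indistinguishable from a smaller offset of opposite sign, which is the flipping effect described above.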
B. Regression
If we restrict our analysis to distances that do not produce
the above "flipping" effect, we may invert the axes and a clear
logarithmic relationship emerges (Fig. 7). Excel gives the
following model:
Distance = −2.059 · ln(|Peak X|) + 11.529 (R² = 0.9812).
It is important to note that this formula was generated under
specific conditions, including a 73° field of view and a 9.5-inch
baseline, but it is expected to lay the groundwork for a more
general formula. With this formula we were able to determine
distances from 2 ft to 5 ft, with an accuracy of ±3 in.
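The same fit can be reproduced as a least-squares line in log space. In the sketch below, synthetic (|Peak X|, distance) points are generated from the published model purely to illustrate the procedure; they are not new measurements:

```python
import numpy as np

# hypothetical |Peak X| values; distances generated from the fitted model
peak_x = np.array([100.0, 60.0, 35.0, 20.0])
dist = -2.059 * np.log(peak_x) + 11.529     # feet, per the Excel fit

# recover the coefficients of dist = a * ln(|Peak X|) + b
a, b = np.polyfit(np.log(peak_x), dist, 1)
```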
Fig. 7. Absolute value of peak x vs distance.
IV. CONCLUSIONS
The work to date achieved its goal of showing that distance can
plausibly be recovered from a joint transform correlation peak. While
this work shows promise, further work is needed to elucidate and
generalize the relationship between peak location and distance.
This will move us closer to the ultimate goal of finding a formula
relating Peak X, Peak Y, field of view, baseline, and distance.
The wrapping and flipping effect presents a formidable
challenge to this goal, and will likely be the next target of
investigation. Another obvious direction future research will
take is to test the effect of varying baseline on the location of the
correlation peak.
Finally, at this time the method has not been tested for
practical applications like the CubeSat platform (for which this
research was originally conceived). The work to date and near-
future work is entirely proof of concept, and we may expect that
applications follow once a strong foundation is laid.
ACKNOWLEDGMENT
This material is based upon work supported by the National
Science Foundation Research Experiences for Undergraduates
program under Grant No. NSF 1359244.
REFERENCES
[1] A. Vander Lugt, "Signal detection by complex spatial filtering," IEEE
Transactions on Information Theory, vol. 10, pp. 139-145, 1964.
[2] C. S. Weaver and J. W. Goodman, "A Technique for Optically
Convolving Two Functions," Applied Optics, vol. 5, pp. 1248-1249, 1966.
[3] A. J. Barry and R. Tedrake, "Pushbroom stereo for high-speed navigation
in cluttered environments," Robotics and Automation (ICRA), 2015 IEEE
International Conference on, Seattle, WA, 2015, pp. 3046-3052.
[4] J. Mrovlje and D. Vrančić, "Distance measuring based on
stereoscopic pictures," 9th International PhD Workshop on Systems and
Control: Young Generation Viewpoint, 2008.
[5] E. Tjandranegara, "Distance Estimation Algorithm for Stereo Pair
Images," Purdue ECE Tech. Rep., West Lafayette, IN, Rep. 64, 2005.
[6] R. Mandelbaum, L. McDowell, L. Bogoni, B. Reich and M. Hansen,
"Real-time stereo processing, obstacle detection, and terrain estimation
from vehicle-mounted stereo cameras," Applications of Computer Vision,
1998. WACV '98. Proceedings., Fourth IEEE Workshop on, Princeton,
NJ, 1998, pp. 288-289.
y = -2.059ln(x) + 11.529
R² = 0.9812
0
2
4
6
8
10
12
0 20 40 60 80 100 120
Distance
|Peak X|
|Peak X| vs. Distance

More Related Content

What's hot

IRJET- Moving Object Detection using Foreground Detection for Video Surveil...
IRJET- 	 Moving Object Detection using Foreground Detection for Video Surveil...IRJET- 	 Moving Object Detection using Foreground Detection for Video Surveil...
IRJET- Moving Object Detection using Foreground Detection for Video Surveil...
IRJET Journal
 
A new approach of edge detection in sar images using
A new approach of edge detection in sar images usingA new approach of edge detection in sar images using
A new approach of edge detection in sar images using
eSAT Publishing House
 
AUTOMATED IMAGE MOSAICING SYSTEM WITH ANALYSIS OVER VARIOUS IMAGE NOISE
AUTOMATED IMAGE MOSAICING SYSTEM WITH ANALYSIS OVER VARIOUS IMAGE NOISEAUTOMATED IMAGE MOSAICING SYSTEM WITH ANALYSIS OVER VARIOUS IMAGE NOISE
AUTOMATED IMAGE MOSAICING SYSTEM WITH ANALYSIS OVER VARIOUS IMAGE NOISE
ijcsa
 
D04432528
D04432528D04432528
D04432528
IOSR-JEN
 
Video Stitching using Improved RANSAC and SIFT
Video Stitching using Improved RANSAC and SIFTVideo Stitching using Improved RANSAC and SIFT
Video Stitching using Improved RANSAC and SIFT
IRJET Journal
 
DEEP LEARNING BASED TARGET TRACKING AND CLASSIFICATION DIRECTLY IN COMPRESSIV...
DEEP LEARNING BASED TARGET TRACKING AND CLASSIFICATION DIRECTLY IN COMPRESSIV...DEEP LEARNING BASED TARGET TRACKING AND CLASSIFICATION DIRECTLY IN COMPRESSIV...
DEEP LEARNING BASED TARGET TRACKING AND CLASSIFICATION DIRECTLY IN COMPRESSIV...
sipij
 
Remotely sensed image segmentation using multiphase level set acm
Remotely sensed image segmentation using multiphase level set acmRemotely sensed image segmentation using multiphase level set acm
Remotely sensed image segmentation using multiphase level set acm
Kriti Bajpai
 
Unsupervised Building Extraction from High Resolution Satellite Images Irresp...
Unsupervised Building Extraction from High Resolution Satellite Images Irresp...Unsupervised Building Extraction from High Resolution Satellite Images Irresp...
Unsupervised Building Extraction from High Resolution Satellite Images Irresp...
CSCJournals
 
A Novel Approach for Tracking with Implicit Video Shot Detection
A Novel Approach for Tracking with Implicit Video Shot DetectionA Novel Approach for Tracking with Implicit Video Shot Detection
A Novel Approach for Tracking with Implicit Video Shot Detection
IOSR Journals
 
G04743943
G04743943G04743943
G04743943
IOSR-JEN
 
motion and feature based person tracking in survillance videos
motion and feature based person tracking in survillance videosmotion and feature based person tracking in survillance videos
motion and feature based person tracking in survillance videos
shiva kumar cheruku
 
AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING
AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING
AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING
ijp2p
 
3680-NoCA.pptx
3680-NoCA.pptx3680-NoCA.pptx
3680-NoCA.pptxgrssieee
 
Analog signal processing solution
Analog signal processing solutionAnalog signal processing solution
Analog signal processing solution
csandit
 
Haze removal for a single remote sensing image based on deformed haze imaging...
Haze removal for a single remote sensing image based on deformed haze imaging...Haze removal for a single remote sensing image based on deformed haze imaging...
Haze removal for a single remote sensing image based on deformed haze imaging...
LogicMindtech Nologies
 
AUTOMATIC IDENTIFICATION OF CLOUD COVER REGIONS USING SURF
AUTOMATIC IDENTIFICATION OF CLOUD COVER REGIONS USING SURF AUTOMATIC IDENTIFICATION OF CLOUD COVER REGIONS USING SURF
AUTOMATIC IDENTIFICATION OF CLOUD COVER REGIONS USING SURF
ijcseit
 
Object tracking
Object trackingObject tracking
Object tracking
Sri vidhya k
 
isvc_draft6_final_1_harvey_mudd (1)
isvc_draft6_final_1_harvey_mudd (1)isvc_draft6_final_1_harvey_mudd (1)
isvc_draft6_final_1_harvey_mudd (1)David Tenorio
 

What's hot (20)

IRJET- Moving Object Detection using Foreground Detection for Video Surveil...
IRJET- 	 Moving Object Detection using Foreground Detection for Video Surveil...IRJET- 	 Moving Object Detection using Foreground Detection for Video Surveil...
IRJET- Moving Object Detection using Foreground Detection for Video Surveil...
 
A new approach of edge detection in sar images using
A new approach of edge detection in sar images usingA new approach of edge detection in sar images using
A new approach of edge detection in sar images using
 
AUTOMATED IMAGE MOSAICING SYSTEM WITH ANALYSIS OVER VARIOUS IMAGE NOISE
AUTOMATED IMAGE MOSAICING SYSTEM WITH ANALYSIS OVER VARIOUS IMAGE NOISEAUTOMATED IMAGE MOSAICING SYSTEM WITH ANALYSIS OVER VARIOUS IMAGE NOISE
AUTOMATED IMAGE MOSAICING SYSTEM WITH ANALYSIS OVER VARIOUS IMAGE NOISE
 
D04432528
D04432528D04432528
D04432528
 
Video Stitching using Improved RANSAC and SIFT
Video Stitching using Improved RANSAC and SIFTVideo Stitching using Improved RANSAC and SIFT
Video Stitching using Improved RANSAC and SIFT
 
DEEP LEARNING BASED TARGET TRACKING AND CLASSIFICATION DIRECTLY IN COMPRESSIV...
DEEP LEARNING BASED TARGET TRACKING AND CLASSIFICATION DIRECTLY IN COMPRESSIV...DEEP LEARNING BASED TARGET TRACKING AND CLASSIFICATION DIRECTLY IN COMPRESSIV...
DEEP LEARNING BASED TARGET TRACKING AND CLASSIFICATION DIRECTLY IN COMPRESSIV...
 
Remotely sensed image segmentation using multiphase level set acm
Remotely sensed image segmentation using multiphase level set acmRemotely sensed image segmentation using multiphase level set acm
Remotely sensed image segmentation using multiphase level set acm
 
Unsupervised Building Extraction from High Resolution Satellite Images Irresp...
Unsupervised Building Extraction from High Resolution Satellite Images Irresp...Unsupervised Building Extraction from High Resolution Satellite Images Irresp...
Unsupervised Building Extraction from High Resolution Satellite Images Irresp...
 
A Novel Approach for Tracking with Implicit Video Shot Detection
A Novel Approach for Tracking with Implicit Video Shot DetectionA Novel Approach for Tracking with Implicit Video Shot Detection
A Novel Approach for Tracking with Implicit Video Shot Detection
 
G04743943
G04743943G04743943
G04743943
 
motion and feature based person tracking in survillance videos
motion and feature based person tracking in survillance videosmotion and feature based person tracking in survillance videos
motion and feature based person tracking in survillance videos
 
AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING
AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING
AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING
 
3680-NoCA.pptx
3680-NoCA.pptx3680-NoCA.pptx
3680-NoCA.pptx
 
Analog signal processing solution
Analog signal processing solutionAnalog signal processing solution
Analog signal processing solution
 
Haze removal for a single remote sensing image based on deformed haze imaging...
Haze removal for a single remote sensing image based on deformed haze imaging...Haze removal for a single remote sensing image based on deformed haze imaging...
Haze removal for a single remote sensing image based on deformed haze imaging...
 
AUTOMATIC IDENTIFICATION OF CLOUD COVER REGIONS USING SURF
AUTOMATIC IDENTIFICATION OF CLOUD COVER REGIONS USING SURF AUTOMATIC IDENTIFICATION OF CLOUD COVER REGIONS USING SURF
AUTOMATIC IDENTIFICATION OF CLOUD COVER REGIONS USING SURF
 
Object tracking
Object trackingObject tracking
Object tracking
 
40120140503006
4012014050300640120140503006
40120140503006
 
Final Review
Final ReviewFinal Review
Final Review
 
isvc_draft6_final_1_harvey_mudd (1)
isvc_draft6_final_1_harvey_mudd (1)isvc_draft6_final_1_harvey_mudd (1)
isvc_draft6_final_1_harvey_mudd (1)
 

Viewers also liked

Hand gesture recognition using ultrasonic sensor and atmega128 microcontroller
Hand gesture recognition using ultrasonic sensor and atmega128 microcontrollerHand gesture recognition using ultrasonic sensor and atmega128 microcontroller
Hand gesture recognition using ultrasonic sensor and atmega128 microcontroller
eSAT Publishing House
 
Project Report Distance measurement system
Project Report Distance measurement systemProject Report Distance measurement system
Project Report Distance measurement system
kurkute1994
 
Distance Measurement by Ultrasonic Sensor
Distance Measurement by Ultrasonic SensorDistance Measurement by Ultrasonic Sensor
Distance Measurement by Ultrasonic Sensor
Edgefxkits & Solutions
 
B.Tech.Final Year ECE Project Report on Ultrasonic distance measure robot
B.Tech.Final Year ECE Project Report on Ultrasonic distance measure robotB.Tech.Final Year ECE Project Report on Ultrasonic distance measure robot
B.Tech.Final Year ECE Project Report on Ultrasonic distance measure robot
Sushant Shankar
 
Ultrasonic based distance measurement system
Ultrasonic based distance measurement systemUltrasonic based distance measurement system
Ultrasonic based distance measurement system
Mrinal Sharma
 
State of the Word 2011
State of the Word 2011State of the Word 2011
State of the Word 2011
photomatt
 

Viewers also liked (7)

Hand gesture recognition using ultrasonic sensor and atmega128 microcontroller
Hand gesture recognition using ultrasonic sensor and atmega128 microcontrollerHand gesture recognition using ultrasonic sensor and atmega128 microcontroller
Hand gesture recognition using ultrasonic sensor and atmega128 microcontroller
 
Project Report Distance measurement system
Project Report Distance measurement systemProject Report Distance measurement system
Project Report Distance measurement system
 
Distance Measurement by Ultrasonic Sensor
Distance Measurement by Ultrasonic SensorDistance Measurement by Ultrasonic Sensor
Distance Measurement by Ultrasonic Sensor
 
B.Tech.Final Year ECE Project Report on Ultrasonic distance measure robot
B.Tech.Final Year ECE Project Report on Ultrasonic distance measure robotB.Tech.Final Year ECE Project Report on Ultrasonic distance measure robot
B.Tech.Final Year ECE Project Report on Ultrasonic distance measure robot
 
Ultrasonic based distance measurement system
Ultrasonic based distance measurement systemUltrasonic based distance measurement system
Ultrasonic based distance measurement system
 
State of the Word 2011
State of the Word 2011State of the Word 2011
State of the Word 2011
 
Slideshare ppt
Slideshare pptSlideshare ppt
Slideshare ppt
 

Similar to Object Distance Detection using a Joint Transform Correlator

Visual Environment by Semantic Segmentation Using Deep Learning: A Prototype ...
Visual Environment by Semantic Segmentation Using Deep Learning: A Prototype ...Visual Environment by Semantic Segmentation Using Deep Learning: A Prototype ...
Visual Environment by Semantic Segmentation Using Deep Learning: A Prototype ...
Tomohiro Fukuda
 
Efficient 3D stereo vision stabilization for multi-camera viewpoints
Efficient 3D stereo vision stabilization for multi-camera viewpointsEfficient 3D stereo vision stabilization for multi-camera viewpoints
Efficient 3D stereo vision stabilization for multi-camera viewpoints
journalBEEI
 
A Survey on Single Image Dehazing Approaches
A Survey on Single Image Dehazing ApproachesA Survey on Single Image Dehazing Approaches
A Survey on Single Image Dehazing Approaches
IRJET Journal
 
Detection of Bridges using Different Types of High Resolution Satellite Images
Detection of Bridges using Different Types of High Resolution Satellite ImagesDetection of Bridges using Different Types of High Resolution Satellite Images
Detection of Bridges using Different Types of High Resolution Satellite Images
idescitation
 
A Review on Deformation Measurement from Speckle Patterns using Digital Image...
A Review on Deformation Measurement from Speckle Patterns using Digital Image...A Review on Deformation Measurement from Speckle Patterns using Digital Image...
A Review on Deformation Measurement from Speckle Patterns using Digital Image...
IRJET Journal
 
IJARCCE 22
IJARCCE 22IJARCCE 22
IJARCCE 22Prasad K
 
AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING
AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING
AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING
ijp2p
 
SHARP OR BLUR: A FAST NO-REFERENCE QUALITY METRIC FOR REALISTIC PHOTOS
SHARP OR BLUR: A FAST NO-REFERENCE QUALITY METRIC FOR REALISTIC PHOTOSSHARP OR BLUR: A FAST NO-REFERENCE QUALITY METRIC FOR REALISTIC PHOTOS
SHARP OR BLUR: A FAST NO-REFERENCE QUALITY METRIC FOR REALISTIC PHOTOS
csandit
 
Satellite Imaging System
Satellite Imaging SystemSatellite Imaging System
Satellite Imaging System
CSCJournals
 
I0343065072
I0343065072I0343065072
I0343065072
ijceronline
 
IRJET- Digital Image Forgery Detection using Local Binary Patterns (LBP) and ...
IRJET- Digital Image Forgery Detection using Local Binary Patterns (LBP) and ...IRJET- Digital Image Forgery Detection using Local Binary Patterns (LBP) and ...
IRJET- Digital Image Forgery Detection using Local Binary Patterns (LBP) and ...
IRJET Journal
 
6 - Conception of an Autonomous UAV using Stereo Vision (presented in an Indo...
6 - Conception of an Autonomous UAV using Stereo Vision (presented in an Indo...6 - Conception of an Autonomous UAV using Stereo Vision (presented in an Indo...
6 - Conception of an Autonomous UAV using Stereo Vision (presented in an Indo...
Youness Lahdili
 
Design and implementation of video tracking system based on camera field of view
Design and implementation of video tracking system based on camera field of viewDesign and implementation of video tracking system based on camera field of view
Design and implementation of video tracking system based on camera field of view
sipij
 
Stereo Correspondence Algorithms for Robotic Applications Under Ideal And Non...
Stereo Correspondence Algorithms for Robotic Applications Under Ideal And Non...Stereo Correspondence Algorithms for Robotic Applications Under Ideal And Non...
Stereo Correspondence Algorithms for Robotic Applications Under Ideal And Non...
CSCJournals
 
INFORMATION SATURATION IN MULTISPECTRAL PIXEL LEVEL IMAGE FUSION
INFORMATION SATURATION IN MULTISPECTRAL PIXEL LEVEL IMAGE FUSIONINFORMATION SATURATION IN MULTISPECTRAL PIXEL LEVEL IMAGE FUSION
INFORMATION SATURATION IN MULTISPECTRAL PIXEL LEVEL IMAGE FUSION
IJCI JOURNAL
 
120_SEM_Special_Topics.ppt
120_SEM_Special_Topics.ppt120_SEM_Special_Topics.ppt
120_SEM_Special_Topics.ppt
zaki194502
 
V.karthikeyan published article a..a
V.karthikeyan published article a..aV.karthikeyan published article a..a
V.karthikeyan published article a..aKARTHIKEYAN V
 
V.KARTHIKEYAN PUBLISHED ARTICLE
V.KARTHIKEYAN PUBLISHED ARTICLEV.KARTHIKEYAN PUBLISHED ARTICLE
V.KARTHIKEYAN PUBLISHED ARTICLE
KARTHIKEYAN V
 
An Unsupervised Change Detection in Satellite IMAGES Using MRFFCM Clustering
An Unsupervised Change Detection in Satellite IMAGES Using MRFFCM ClusteringAn Unsupervised Change Detection in Satellite IMAGES Using MRFFCM Clustering
An Unsupervised Change Detection in Satellite IMAGES Using MRFFCM Clustering
Editor IJCATR
 

Similar to Object Distance Detection using a Joint Transform Correlator (20)

Visual Environment by Semantic Segmentation Using Deep Learning: A Prototype ...
Visual Environment by Semantic Segmentation Using Deep Learning: A Prototype ...Visual Environment by Semantic Segmentation Using Deep Learning: A Prototype ...
Visual Environment by Semantic Segmentation Using Deep Learning: A Prototype ...
 
Efficient 3D stereo vision stabilization for multi-camera viewpoints
Efficient 3D stereo vision stabilization for multi-camera viewpointsEfficient 3D stereo vision stabilization for multi-camera viewpoints
Efficient 3D stereo vision stabilization for multi-camera viewpoints
 
A Survey on Single Image Dehazing Approaches
A Survey on Single Image Dehazing ApproachesA Survey on Single Image Dehazing Approaches
A Survey on Single Image Dehazing Approaches
 
Detection of Bridges using Different Types of High Resolution Satellite Images
Detection of Bridges using Different Types of High Resolution Satellite ImagesDetection of Bridges using Different Types of High Resolution Satellite Images
Detection of Bridges using Different Types of High Resolution Satellite Images
 
A Review on Deformation Measurement from Speckle Patterns using Digital Image...
A Review on Deformation Measurement from Speckle Patterns using Digital Image...A Review on Deformation Measurement from Speckle Patterns using Digital Image...
A Review on Deformation Measurement from Speckle Patterns using Digital Image...
 
IJARCCE 22
IJARCCE 22IJARCCE 22
IJARCCE 22
 
ei2106-submit-opt-415
ei2106-submit-opt-415ei2106-submit-opt-415
ei2106-submit-opt-415
 
AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING
AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING
AN ADAPTIVE MESH METHOD FOR OBJECT TRACKING
 
SHARP OR BLUR: A FAST NO-REFERENCE QUALITY METRIC FOR REALISTIC PHOTOS
SHARP OR BLUR: A FAST NO-REFERENCE QUALITY METRIC FOR REALISTIC PHOTOSSHARP OR BLUR: A FAST NO-REFERENCE QUALITY METRIC FOR REALISTIC PHOTOS
SHARP OR BLUR: A FAST NO-REFERENCE QUALITY METRIC FOR REALISTIC PHOTOS
 
Satellite Imaging System
Satellite Imaging SystemSatellite Imaging System
Satellite Imaging System
 
I0343065072
I0343065072I0343065072
I0343065072
 
IRJET- Digital Image Forgery Detection using Local Binary Patterns (LBP) and ...
IRJET- Digital Image Forgery Detection using Local Binary Patterns (LBP) and ...IRJET- Digital Image Forgery Detection using Local Binary Patterns (LBP) and ...
IRJET- Digital Image Forgery Detection using Local Binary Patterns (LBP) and ...
 
6 - Conception of an Autonomous UAV using Stereo Vision (presented in an Indo...
6 - Conception of an Autonomous UAV using Stereo Vision (presented in an Indo...6 - Conception of an Autonomous UAV using Stereo Vision (presented in an Indo...
6 - Conception of an Autonomous UAV using Stereo Vision (presented in an Indo...
 
Design and implementation of video tracking system based on camera field of view
Design and implementation of video tracking system based on camera field of viewDesign and implementation of video tracking system based on camera field of view
Design and implementation of video tracking system based on camera field of view
 
Stereo Correspondence Algorithms for Robotic Applications Under Ideal And Non...
Stereo Correspondence Algorithms for Robotic Applications Under Ideal And Non...Stereo Correspondence Algorithms for Robotic Applications Under Ideal And Non...
Stereo Correspondence Algorithms for Robotic Applications Under Ideal And Non...
 
INFORMATION SATURATION IN MULTISPECTRAL PIXEL LEVEL IMAGE FUSION
INFORMATION SATURATION IN MULTISPECTRAL PIXEL LEVEL IMAGE FUSIONINFORMATION SATURATION IN MULTISPECTRAL PIXEL LEVEL IMAGE FUSION
INFORMATION SATURATION IN MULTISPECTRAL PIXEL LEVEL IMAGE FUSION
 
120_SEM_Special_Topics.ppt
120_SEM_Special_Topics.ppt120_SEM_Special_Topics.ppt
120_SEM_Special_Topics.ppt
 
V.karthikeyan published article a..a
V.karthikeyan published article a..aV.karthikeyan published article a..a
V.karthikeyan published article a..a
 
V.KARTHIKEYAN PUBLISHED ARTICLE
V.KARTHIKEYAN PUBLISHED ARTICLEV.KARTHIKEYAN PUBLISHED ARTICLE
V.KARTHIKEYAN PUBLISHED ARTICLE
 
An Unsupervised Change Detection in Satellite IMAGES Using MRFFCM Clustering
An Unsupervised Change Detection in Satellite IMAGES Using MRFFCM ClusteringAn Unsupervised Change Detection in Satellite IMAGES Using MRFFCM Clustering
An Unsupervised Change Detection in Satellite IMAGES Using MRFFCM Clustering
 

Object Distance Detection using a Joint Transform Correlator

  • 1. Object Distance Detection using a Joint Transform Correlator Alexander Layton Dept. of Computer Science University of Illinois at Urbana-Champaign Urbana, IL Dr. Ronald Marsh Dept. of Computer Science University of North Dakota Grand Forks, ND Abstract—Computer stereo vision makes heavy use of object distance detection. The primary method to detect object distance is to compare two images of the same scene taken from different vantage points. The necessity of comparing two images naturally leads us to investigate optical correlators. Since the Fourier transform, on which optical correlators are based, is lossless, we suppose that distance information encoded in a stereo image pair is preserved through the correlation process. We then try to recover that distance by investigating the location of the correlation peaks. Initial data indicates that we may plausibly extract distance information from a correlation result. However, this data was gathered under very specific and controlled conditions, and further research is necessary to derive a more general result. Keywords—computer vision; stereo vision; optics; optical correlation; joint transform correlator; distance detection I. INTRODUCTION A. Optical Correlators An optical correlator is a device for comparing two images using Fourier transforms. More specifically, an optical correlator takes in two input images and outputs their cross- correlation. We may think of the correlation result as locating one image inside the other. If the two input images are sufficiently similar, we will see a bright spot in the correlation result, hereby referred to as a peak. The location of the peak indicates the location of one image within the other. A. VanderLugt developed the first successful optical correlator, the matched filter correlator, in 1963 [1]. The matched filter correlator (MFC) pioneered development of optical correlators and is still in use today. 
However, the MFC design requires specialized hardware and is highly sensitive to the alignment of its instruments. The MFC was originally designed to locate a particular image, known as the “reference” or “filter” image, inside many other images, the “target” or “input” images. As such, the reference and target image are treated differently within the correlator, and the MFC is best suited to such asymmetrical applications. To overcome the limitations of the matched filter correlator, Weaver and Goodman invented the joint transform correlator (JTC) in 1966 [2]. The JTC is much less sensitive to instrument alignment but is less space-efficient. Additionally, both input images in a JTC undergo the same transformations, without regard for a “target” or “reference” image. Thus, the JTC is better suited to applications without preferential treatment of either input image. (It should be noted, however, that while the JTC does not discriminate between input images, the correlation result depends on each image’s position in the focal plane. Thus we will not obtain the same result if we swap our two input images.) Optical correlation need not be done optically anymore; the same process can be performed programmatically [3]. Though the correlation is no longer real-time, one has the opportunity to post-analyze the correlation result. B. Application to Object Distance Detection The root of computer stereo vision is using two or more 2D images to reproduce a 3D scene. In particular, a computer detects the distance to an object by measuring the shift in the object’s 2D location from one vantage point to another [4]. Comparing two images of the same object leads us naturally to explore optical correlators. Since the Fourier transform is lossless, we posit that any distance information encoded in the original images will be preserved in their correlation. In effect, the correlator automates the process of finding a common pixel in the algorithm outlined in [5]. 
Since we use a new pair of images for each distance measurement, the joint transform correlator is the appropriate design.

This experiment is entirely exploratory in nature; we are only concerned with the plausibility of applying optical correlation to object distance detection, not the practicality. We hope, however, that this research may find applications in space, where the background is more or less static. Nevertheless, to the best of our knowledge, the use of a JTC to determine distance to an object is a novel approach.

II. EXPERIMENT

A. Setup

Images were generated by two Microsoft LifeCam Cinema webcams aligned horizontally with a 9.5 in baseline separation between cameras (Fig. 1). LifeCam Cinema webcams have a 73° field of view and an autofocusing lens.

This research was made possible by the National Science Foundation, Award #1359224, with support from the Department of Defense.
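For reference, the conventional (non-correlator) route from this setup's geometry to a distance is the standard pinhole-camera stereo relation used in [4]: the focal length in pixels follows from the field of view, and distance is baseline times focal length over disparity. The sketch below uses the experiment's 9.5 in baseline and 73° field of view; the image width and disparity values are illustrative assumptions, not measured data.

```python
import math

def stereo_distance(disparity_px, image_width_px, baseline, fov_deg):
    """Distance from horizontal disparity via the standard stereo relation.

    Each camera is modeled as a pinhole with horizontal field of view
    `fov_deg`; the focal length in pixels is then
    f = image_width / (2 * tan(fov / 2)), and
    distance = baseline * f / disparity (in the units of `baseline`).
    """
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: object at infinity or mismatched")
    focal_px = image_width_px / (2.0 * math.tan(math.radians(fov_deg) / 2.0))
    return baseline * focal_px / disparity_px

# Illustrative call with this experiment's parameters (9.5 in, 73 degrees);
# a 40-pixel disparity in a 640-pixel-wide image is an assumed example.
d = stereo_distance(disparity_px=40, image_width_px=640, baseline=9.5, fov_deg=73.0)
```

The correlation peak studied in this paper plays the role of the disparity term in this relation, which is why an inverse (here logarithmic-looking over the tested range) dependence of distance on peak offset is to be expected.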
Fig. 1. Microsoft LifeCam Cinema webcams used for data collection.

Fig. 2. Post-It note on wall from 5 ft away.

The subject was a 2 in × 1.5 in Post-It note on a wall (Fig. 2). We photograph a wall to minimize the confounding effect of the background on the correlation. To this end, we make sure the wall is as evenly lit as possible. The Post-It note was chosen to contrast strongly with the wall, making it easier for the JTC to identify the object.

B. Procedure

We measured the distance to the wall with a tape measure, then took a picture with each camera. We used ImageMagick to crop each image and then extract the value channel, producing a black-and-white square image for the correlator. Each input image served once as the "target" and once as the "filter." The correlation program also attempted to detect correlation peaks using an algorithm developed by Dr. Marsh. The actual distance and peak data were exported into an Excel spreadsheet.

The cameras would frequently fail to focus on the wall; this led to blurry input images that would not produce a strong correlation peak. These correlation results were thrown out. Frequently, a peak would be plainly visible but not strong enough for the algorithm to detect. For example, a peak that occupies two pixels in the output image would slip past the detector, since neither pixel has sufficient contrast with its neighbors, despite being clearly enough resolved to be a useful data point. In these cases, the peak locations were entered manually.

III. DATA

Each stereo pair of input images (Fig. 3) produced a pair of correlation results (Fig. 4).

Fig. 3. A pair of input images.

Fig. 4. A pair of correlation results. The peak is the cross-shaped mark.

Fig. 5. Coordinates of the peak vs. distance from wall.

We plot the location of the peak versus the distance to the wall to find a clear relation (Fig. 5). Coordinates of the peak are measured from center, since distance to the wall should not depend on the size of the image.
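The details of the peak detector used here are not published in this paper; as a minimal stand-in, a detector that picks the brightest pixel, rejects it when it does not stand out from the background (mirroring the weak-peak failures described above), and reports center-relative coordinates might look like the following. The `contrast` threshold is an assumed parameter.

```python
import numpy as np

def find_peak(corr, contrast=5.0):
    """Locate a correlation peak, measured from the image center.

    A simple illustrative detector (not the algorithm used in the
    paper): take the brightest pixel and accept it only if it exceeds
    the background median by `contrast` standard deviations. Returns
    (x, y) offsets from center, or None when no pixel qualifies.
    """
    corr = np.asarray(corr, dtype=float)
    y, x = np.unravel_index(np.argmax(corr), corr.shape)
    background = np.median(corr)
    spread = corr.std()
    if spread == 0 or (corr[y, x] - background) / spread < contrast:
        return None  # peak too weak to trust
    cy, cx = corr.shape[0] // 2, corr.shape[1] // 2
    # Report coordinates relative to center, as in the plots above.
    return x - cx, y - cy
```

A two-pixel peak of the kind described above would split its energy between neighbors and can fall below any single-pixel contrast test, which is exactly the failure mode that forced manual entry in this experiment.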
Note that we take the absolute value of the coordinates, since swapping the input images negates the peak's coordinates.

A. Wrapping and Flipping Effect

As distance to the object decreases, the X coordinate of the peak increases, up to a maximum of half the image size. In our testing environment, this occurs at around 1.5 feet. If the actual distance to the object decreases beyond that, the peak wraps around the image and then has both coordinates flipped (Fig. 6).
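The wrap-around part of this effect is a property of the discrete Fourier transform: output indices are circular, so an offset pushed past half the image size reappears on the opposite side. A sketch of undoing that wrap, by reading indices modulo N as signed offsets in [-N/2, N/2), is below; this addresses only the wrapping, not the accompanying coordinate flip, whose correction also depends on which off-axis correlation term the detector lands on.

```python
def signed_peak_coords(py, px, shape):
    """Map raw FFT-plane peak indices to signed offsets from center.

    Because the DFT is circular, an offset beyond half the plane size
    wraps to the other side; interpreting each index modulo N as a
    signed value in [-N/2, N/2) undoes the wrap. Illustrative sketch
    only, not the paper's full wrapping-and-flipping correction.
    """
    h, w = shape
    sy = (py + h // 2) % h - h // 2
    sx = (px + w // 2) % w - w // 2
    return sy, sx
```

For example, a peak reported at column 28 of a 32-pixel-wide plane is more naturally read as an offset of -4 than +28.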
Fig. 6. Effects of an off-image peak.

B. Regression

If we restrict our analysis to distances that do not produce the above "flipping" effect, we may invert the data, and a regression becomes quite obvious (Fig. 7). Excel gives the following logarithmic model:

Distance = -2.059 ln(|Peak X|) + 11.529, with R² = 0.9812.

It is important to note that this formula was generated under specific conditions, including a 73° field of view and a 9.5 in baseline, but it is expected to lay the groundwork for a more general formula. With this formula we were able to determine distances from 2 ft to 5 ft, with an accuracy of ±3 in.

Fig. 7. Absolute value of Peak X vs. distance.

IV. CONCLUSIONS

The work to date achieved its goal of showing that distance can be recovered from a joint transform correlation peak. While this work shows promise, further work is needed to elucidate and generalize the relationship between peak location and distance. This will move us closer to the ultimate goal of finding a formula relating Peak X, Peak Y, field of view, baseline, and distance. The wrapping and flipping effect presents a formidable challenge to this goal, and will likely be the next target of investigation.

Another obvious direction for future research is to test the effect of varying the baseline on the location of the correlation peak.

Finally, at this time the method has not been tested for practical applications like the CubeSat platform (for which this research was originally conceived). The work to date and near-future work is entirely proof of concept, and we may expect applications to follow once a strong foundation is laid.

ACKNOWLEDGMENT

This material is based upon work supported by the National Science Foundation Research Experience for Undergraduates under Grant No. NSF 1359244.

REFERENCES

[1] A. Vander Lugt, "Signal detection by complex spatial filtering," IEEE Transactions on Information Theory, vol. 10, pp. 139-145, 1964.
[2] C. S. Weaver and J. W. Goodman, "A Technique for Optically Convolving Two Functions," Applied Optics, vol. 5, pp. 1248-1249, 1966.
[3] A. J. Barry and R. Tedrake, "Pushbroom stereo for high-speed navigation in cluttered environments," in Proc. 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, 2015, pp. 3046-3052.
[4] J. Mrovlje and D. Vrančić, "Distance measuring based on stereoscopic pictures," in Proc. 9th International PhD Workshop on Systems and Control: Young Generation Viewpoint, 2008.
[5] E. Tjandranegara, "Distance Estimation Algorithm for Stereo Pair Images," Purdue ECE Tech. Rep. 64, West Lafayette, IN, 2005.
[6] R. Mandelbaum, L. McDowell, L. Bogoni, B. Reich, and M. Hansen, "Real-time stereo processing, obstacle detection, and terrain estimation from vehicle-mounted stereo cameras," in Proc. Fourth IEEE Workshop on Applications of Computer Vision (WACV '98), Princeton, NJ, 1998, pp. 288-289.