1. The Information & Mutual Information Ratio
for Counting Image Features and Their Matches
Ali Khajegili Mirabadi
ECE Department
Isfahan University of Technology, Iran
Stefano Rini
ECE Department
National Chiao Tung University, Taiwan
IWCIT 2020
May 27, 2020
2. Outline
Introduction
• Image Features
• Image Matching Problem
Motivation
Information Ratio (IR)
Mutual Information Ratio (MIR)
Lower Bounds
• LIR
• LMIR
Numerical Experiments
• Evaluating the IR feature
• Evaluating the MIR feature
• Effectiveness of the features
Conclusion
Future Works
3. Introduction: Image Features

Handcrafted feature extraction:
• Oriented FAST and Rotated BRIEF (ORB) [1]
• Speeded-Up Robust Features (SURF) [2]
• KAZE [3]

Deep features:
• FASTER [4]
• V-FAST [5]

Local feature points, known as the image information.
[1] E. Rublee, et al., "ORB: An efficient alternative to SIFT or SURF," International Conference on Computer Vision, 2011.
[2] H. Bay, et al., "Speeded-Up Robust Features (SURF)," Computer Vision and Image Understanding, 2008.
[3] P. F. Alcantarilla, et al., "KAZE features," European Conference on Computer Vision, 2012.
[4] E. Rosten, et al., "Faster and better: A machine learning approach to corner detection," IEEE TPAMI, 2008.
[5] T.-H. Yu, et al., "Real-time action recognition by spatiotemporal semantic and structural forests," BMVC, 2010.
4. Introduction: Image Matching Problem

Image feature points for matching should be:
• Scale invariant
• Rotation invariant
• Viewpoint invariant
• Robust to illumination and quality variation

Applications:
• Image stitching (to produce a panorama or high-resolution image)
• Image registration (alignment)
• 3D reconstruction
• Depth estimation
• Etc.

[Figure: feature matching between a first frame and a second frame]
5. Motivation

To date, nothing is known in advance about:
• the number of feature points, before feature extraction
• the number of matched feature points, before the matching process

How many feature points exist in a given image, regardless of how the features are described?
How many common features can be determined between two given images?

Objective: estimate these counts from the image (channel) histogram. This approach is substantially different from the other approaches in the literature.

Challenge: the image histogram does not preserve how the intensities are distributed over the image pixels.
6. Notation

Definition:
• [m] := {0, …, m} = ℤ/(m+1)ℤ

Color space:
• standard RGB (Red, Green, and Blue)

Image channel:
• X(C) of dimension N × M, where C ∈ {R, G, B}
• each image pixel is itself a vector in [2^D − 1]^3, where D is the color depth (usually 8)

For the channel with distribution P_{X(C)}(i):
• h_i(C) is the image channel histogram, for i ∈ [2^D − 1]
• h_{i,j}(C) is the joint histogram of two consecutive channels X_1(C) and X_2(C), for i, j ∈ [2^D − 1]
• probability mass function: p_i(C) = h_i(C) / (NM)
• joint probability mass function: p_{i,j}(C) = h_{i,j}(C) / (NM)
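The histograms and probability mass functions above translate directly into code. A minimal NumPy sketch (the function names are ours, for illustration only, not from the paper):

```python
import numpy as np

def channel_histogram(channel, depth=8):
    """h_i(C): histogram of an N x M image channel with color depth D."""
    return np.bincount(channel.ravel(), minlength=2 ** depth)

def channel_pmf(channel, depth=8):
    """p_i(C) = h_i(C) / (N * M)."""
    return channel_histogram(channel, depth) / channel.size

def joint_histogram(ch1, ch2, depth=8):
    """h_{i,j}(C): joint histogram of the same channel in two consecutive frames."""
    levels = 2 ** depth
    # encode each pixel pair (i, j) as a single bin index i * levels + j
    pairs = ch1.ravel().astype(np.int64) * levels + ch2.ravel()
    return np.bincount(pairs, minlength=levels * levels).reshape(levels, levels)
```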
8. Mutual Information Ratio (MIR)

In an analogous way to the IR image feature, by using the likelihood ratio and the previous procedure:

• m_{i,j}(C, X_1; X_2) := log( p_{i,j}(C) / (p_i(C) p_j(C)) ) / log h_{i,j}(C)   if h_{i,j}(C) > 1, and 0 otherwise

Summation over all intensities:

• m(C, X_1; X_2) = NM · Σ_{i,j ∈ [2^D−1]} p_{i,j}(C) · m_{i,j}(C, X_1; X_2)
                 = Σ_{i,j ∈ [2^D−1]} h_{i,j}(C) · m_{i,j}(C, X_1; X_2)   [pixels]
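The MIR summation can be sketched as a toy reference implementation in NumPy (ours, not the authors' code; natural logarithms are used, and the base cancels in the ratio of the two logs):

```python
import numpy as np

def mir(ch1, ch2, depth=8):
    """m(C, X_1; X_2): Mutual Information Ratio of one color channel
    over two consecutive frames, in pixels."""
    levels = 2 ** depth
    nm = ch1.size
    h1 = np.bincount(ch1.ravel(), minlength=levels)
    h2 = np.bincount(ch2.ravel(), minlength=levels)
    pairs = ch1.ravel().astype(np.int64) * levels + ch2.ravel()
    h12 = np.bincount(pairs, minlength=levels * levels).reshape(levels, levels)
    p1, p2, p12 = h1 / nm, h2 / nm, h12 / nm
    total = 0.0
    # m_{i,j} is defined to be 0 whenever h_{i,j}(C) <= 1
    for i, j in zip(*np.nonzero(h12 > 1)):
        m_ij = np.log(p12[i, j] / (p1[i] * p2[j])) / np.log(h12[i, j])
        total += h12[i, j] * m_ij  # sum of h_{i,j} * m_{i,j}  [pixels]
    return total
```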
9. Lower Bounds

A lower bound on the IR feature (LIR):
• r(C, X) ≥ (NM / log NM) · Ĥ(C) ≥ 0, where Ĥ(C) is the sample entropy of the channel.

A lower bound on the MIR feature (LMIR):
• m(C, X_1; X_2) ≥ (NM / log NM) · Î(C) ≥ 0, where Î(C) is the sample mutual information of the channel.

An inequality:
• when D ≤ log(NM) / (2 log 2), both r(C, X) ≤ NM and m(C, X_1; X_2) ≤ NM.
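The LIR bound is directly computable from the histogram. A small sketch (ours; note that Ĥ(C) / log NM is independent of the logarithm base, so natural logs are fine):

```python
import numpy as np

def sample_entropy(channel):
    """Sample entropy H_hat(C) of an 8-bit channel, in nats."""
    p = np.bincount(channel.ravel(), minlength=256) / channel.size
    p = p[p > 0]  # 0 * log(0) terms contribute nothing
    return float(-np.sum(p * np.log(p)))

def lir(channel):
    """LIR: (NM / log NM) * H_hat(C), a lower bound on the IR feature."""
    nm = channel.size
    return nm / np.log(nm) * sample_entropy(channel)
```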
10. Numerical Experiments

Datasets:
• University of Oxford's Affine Covariant Regions [1]
• INRIA Copydays [2]

Algorithms applied for evaluation:
• ORB
• KAZE
• SURF

Image feature distance (d):
• Two features are considered valid when their minimum distance is greater than a chosen threshold d.
• d ∈ {1, 8}

[1] Available: http://www.robots.ox.ac.uk/~vgg/data, 2004.
[2] Available: http://lear.inrialpes.fr/people/jegou/data.php, 2008.
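The minimum-distance validity rule can be sketched as a greedy filter over feature coordinates (our illustration; in the experiments the features come from the ORB, KAZE, and SURF detectors):

```python
import numpy as np

def filter_by_min_distance(points, d):
    """Keep only features whose distance to every already-kept feature
    is greater than the threshold d (greedy scan over the input order)."""
    kept = []
    for p in np.asarray(points, dtype=float):
        if all(np.linalg.norm(p - q) > d for q in kept):
            kept.append(p)
    return np.array(kept)
```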
11. Effectiveness of the IR Feature

Procedure:
• Obtain the IR feature count under varying levels of image brightness, controlled by a coefficient K.
12. Effectiveness of the IR Feature

Showing the effectiveness and applicability of the IR feature.

IR-based optimization method:
• The maximum count of extracted features is not necessarily attained at K = 1.
• Find the K that maximizes the curve of the IR versus K ⇒ K_Optimizer.
• The computational complexity is 1.5 ms on average.

[Plot: precision, values scaled by 10³]
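The search for K_Optimizer can be sketched as a simple grid search over the brightness coefficient. Since the IR definition itself appears on an earlier slide (omitted here), the `ir` scoring function below is a hypothetical callable supplied by the user:

```python
import numpy as np

def optimize_brightness(channel, ir, k_grid=None):
    """Find the coefficient K that maximizes the IR curve versus K.
    `ir` maps a uint8 channel to its IR count (hypothetical callable)."""
    if k_grid is None:
        k_grid = np.linspace(0.2, 2.0, 19)
    best_k, best_score = None, -np.inf
    for k in k_grid:
        # rescale brightness by K, clipping to the valid 8-bit range
        scaled = np.clip(channel.astype(float) * k, 0, 255).astype(np.uint8)
        score = ir(scaled)
        if score > best_score:
            best_k, best_score = float(k), score
    return best_k, best_score
```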
13. Conclusion & Future Works

Conclusion:
• The IR and MIR are two estimates of the number of image feature points and of their matches, respectively.
• The sample entropy and the sample mutual information yield lower bounds (LIR and LMIR) on these features.
• The KAZE, SURF, and ORB algorithms follow the curve of the IR versus the coefficient K.
• KAZE, SURF, and ORB extract fewer features than the IR and LIR features indicate.
• The IR and MIR are measures with the exact meaning of information in the literature.
• The IR-based optimizer method significantly outperforms the applied feature extraction algorithms.
• The LIR and LMIR rely on the information entropy and the mutual information, but they are not related to the normalized forms of these information measures.

Future Works:
• Evaluating the MIR and LMIR over more than two consecutive channels
• Applying these features to the optimization of different computer vision algorithms