MMV Research Laboratory: A Retrospective Around Multimedia and Computer Vision Projects



A brief retrospective of selected projects carried out at the Multimedia and Vision Laboratory of the Universidad del Valle. This talk was presented by teleconference to the Universidad Señor de Sipán, Peru.

Published in: Education, Technology

1. MMV Research Laboratory: A Retrospective Around Multimedia and Computer Vision Projects. Ivan Cabezas, ivan.cabezas@correounivalle.edu.co. July 18th, 2012. Universidad Señor de Sipán – Chiclayo, Peru.
  2. 2. Content Universidad del Valle  A Brief in Figures Multimedia and Vision Laboratory  National Cooperation  Industrial Collaboration  International Cooperation  Research Interests A Camera Model Some Research Projects MPEG7 - SOS Char Morphology An Evaluation Methodology for Stereo Correspondence Algorithms Final Remarks MMV Research Laboratory: A Retrospective Around Multimedia and Computer Vision Projects Slide 2
3. The Universidad del Valle: The Universidad del Valle is the largest university in the southwest of Colombia. http://www.univalle.edu.co
4. Universidad del Valle: A Brief in Figures. Its main campus, Meléndez, covers about one million square meters. There are two campuses in Cali and nine regional campuses in Valle and Cauca. There are 187 study programs offered in Cali, most of them at the graduate level. There are six faculties and two institutes. As of February 2012, it had 27,094 students (88.7% undergraduate). As of December 2011, it had 889 full-time professors (92% with graduate degrees, 30% with a PhD). http://www.univalle.edu.co
5. Multimedia and Vision Laboratory: MMV is a multidisciplinary research group of the EISC. Meetings, 2007 & 2011; INTERACTIVIA, 2009; Maria at UNAL, 2012; Ivan at WAC, 2011; LACNEM, 2009. http://www.lacnem.org
  6. 6. National Cooperation John W. Branch, UNAL- Medellín Cesar Collazos, UniCauca - Popayán Fabio González, UNAL - Bogotá Liliana Salazar Escuela de Ciencias Básicas Doris Hinestroza Departamento de Matemáticas Juan Barraza Escuela de Ingeniería Química Janet Sanabria, Escuela de Recursos Naturales y del Ambiente MMV Research Laboratory: A Retrospective Around Multimedia and Computer Vision Projects Slide 6
7. Industrial Collaborations.
  8. 8. International Cooperation Ebroul Izquierdo Head of the Multimedia and Vision Research Group, School of Electronic Engineering and Computer Science, Queen Mary University of London Aggelos Katsaggelos, Director Motorola Center for Seamless Communications, Northwestern University, USA Panos Liatsis, Head of the Information Engineering and Medical Imaging Group, School of Engineering and Mathematical Sciences, City University London Sergio Velastin, Director Digital Imaging Research Centre, Kingston University, UK Valia Guerra, Instituto de Cibernética, Matemática y Física (ICIMAF), Cuba MMV Research Laboratory: A Retrospective Around Multimedia and Computer Vision Projects Slide 8
  9. 9. Research Interests Multimedia and Computer Vision Images Computing System Information Computer Vision http://www.slideshare.net/mmv-lab-univalle http://vision.mas.ecp.fr/Personnel/teboul/index.php/ MMV Research Laboratory: A Retrospective Around Multimedia and Computer Vision Projects Slide 9
10. A Camera Model: A camera is a sensor following a model: 3D World, Camera System, 2D Images. http://www.univalle.edu.co http://quarknet.fnal.gov/fnal-uc/quarknet-summer-research/QNET2010/Astronomy/ http://homepages.inf.ed.ac.uk/rbf/CVonline/LOCAL_COPIES/FUSIELLO4/tutorial.html
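The slide does not reproduce the equations of the model; as a hedged illustration, the standard pinhole formulation below is one common instance of such a camera model (the symbols K, R, t, f_x, f_y, c_x, c_y are generic, not taken from the slide).

    % Standard pinhole projection: a 3D world point (X, Y, Z) maps to an
    % image point (u, v) through the intrinsic matrix K and the pose [R | t].
    \[
      s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix}
        = K \, [\, R \mid t \,]
          \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix},
      \qquad
      K = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}
    \]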
  11. 11. Some Research Projects MPEG-7 UV  Prokaryota Clusters: Espacial + K - Means MPEG-7 SOS  Vitisoft MMV Research Laboratory: A Retrospective Around Multimedia and Computer Vision Projects Slide 11
12. MPEG-7 SOS: Motivation. How to retrieve images stored in large (and distributed) repositories? M. Florian and M. Trujillo, Relational Database Schema for MPEG-7 Visual Descriptors, IEEE CBIR, 2008. M. Florian and M. Trujillo, Resource Oriented Architecture for Managing Multimedia Content, LACNEM, 2009. M. Florian, MPEG-7 Service Oriented System, Master Research Project, Universidad del Valle, 2008.
13. MPEG-7 SOS: Problem Statement. CBIR systems have some weaknesses. Annotations: wild life, horses, chevaux, potros … M. Florian, MPEG-7 Service Oriented System, Master Research Project, Universidad del Valle, 2008. http://cs.usu.edu/htm/REU-Current-Projects
14. MPEG-7 SOS: The MPEG-7 Standard. M. Florian, MPEG-7 Service Oriented System, Master Research Project, Universidad del Valle, 2008.
15. MPEG-7 SOS: The Proposal. M. Florian, MPEG-7 Service Oriented System, Master Research Project, Universidad del Valle, 2008.
16. Char Morphology: Char, Resin. Particle Classification: 1 Crassisphere, 2 Inertoid, …, 9 Mineroid. Microscopy Camera. D. Chaves and M. Trujillo, Impacto del Muestreo en la Clasificación de Carbonizados de Carbón, 5 CCC, 2010.
17. Char Morphology: Motivation. Energy generation based on coal. http://www.iea.org/textbase/nppdf/free/2010/key_stats_2010.pdf http://www.worldcoal.org/coal/where-is-coal-found/
18. Char Morphology: Inherent Problems & Proposed Approach. Manual coal classification is a subjective and resource-consuming process; automatic classification is proposed. D. Chaves and M. Trujillo, Impacto del Muestreo en la Clasificación de Carbonizados de Carbón, 5 CCC, 2010.
19. Char Morphology: Inherent Problems & Proposed Approach (ii). Sampling has to be considered, as well as blurred images and images with little or no content. D. Chaves and M. Trujillo, Identificación Automática de Imágenes de Carbonizado Borrosas y con poco contenido, CONICA, 2012.
20. An Evaluation Methodology for Stereo Correspondence Algorithms. Ivan Cabezas, Maria Trujillo and Margaret Florian. ivan.cabezas@correounivalle.edu.co. February 25th, 2012. International Conference on Computer Vision Theory and Applications, VISAPP 2012, Rome, Italy.
21. Stereo Vision: The stereo vision problem is to recover the 3D structure of a scene using two or more images. The 3D world is projected by the camera system into 2D stereo images (the optics problem); from the left and right images, a correspondence algorithm computes a disparity map, and a reconstruction algorithm recovers the 3D model (the inverse problem). Yang, Q. et al., Stereo Matching with Colour-Weighted Correlation, Hierarchical Belief Propagation, and Occlusion Handling, IEEE PAMI 2009.
22. Canonical Stereo Geometry and Disparity: Disparity is the distance between corresponding points. [Figure: accurate vs. inaccurate estimation of a scene point P at depth Z, with projections pl and pr on the image planes πl and πr, focal length f, optical centres Cl and Cr, and baseline B.] Trucco, E. and Verri, A., Introductory Techniques for 3D Computer Vision, Prentice Hall, 1998.
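For reference, in the canonical geometry sketched above the disparity and the recovered depth are related by the standard expressions below (f is the focal length, B the baseline, and x_l, x_r the horizontal image coordinates of the corresponding points); this inverse relation between depth and disparity is the one invoked later when the BMP and SZE measures are discussed.

    \[
      d = x_l - x_r,
      \qquad
      Z = \frac{f\,B}{d}
    \]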
23. Ground-truth Based Evaluation: Ground-truth based evaluation relies on comparison against disparity ground-truth data. Scharstein, D. and Szeliski, R., High-accuracy Stereo Depth Maps using Structured Light, CVPR 2003. Tola, E., Lepetit, V. and Fua, P., A Fast Local Descriptor for Dense Matching, CVPR 2008. Strecha, C., et al., On Benchmarking Camera Calibration and Multi-View Stereo for High Resolution Imagery, CVPR 2008. http://www.zf-usa.com/products/3d-laser-scanners/
24. Quantitative Evaluation Methodologies: The use of a methodology allows one to assess specific components and procedures; tune algorithm parameters; support decisions by researchers and practitioners; and measure progress in the field. Szeliski, R., Prediction Error as a Quality Metric for Motion and Stereo, ICCV 2000. Kostliva, J., Cech, J., and Sara, R., Feasibility Boundary in Dense and Semi-Dense Stereo Matching, CVPR 2007. Tombari, F., Mattoccia, S., and Di Stefano, L., Stereo for Robots: Quantitative Evaluation of Efficient and Low-memory Dense Stereo Algorithms, ICARCV 2010. Cabezas, I. and Trujillo, M., A Non-Linear Quantitative Evaluation Approach for Disparity Estimation, VISAPP 2011.
25. Middlebury’s Methodology: Select test bed images; select error criteria (nonocc, all, disc); select and apply stereo algorithms (ObjectStereo, GC+SegmBorder, PUTv3, PatchMatch, ImproveSubPix, OverSegmBP); select error measures; compute error measures. Scharstein, D. and Szeliski, R., High-accuracy Stereo Depth Maps using Structured Light, CVPR 2003. Scharstein, D. and Szeliski, R., http://vision.middlebury.edu/stereo/eval/, 2012.
26. Middlebury’s Methodology (ii): Select and apply stereo algorithms, compute error measures, apply evaluation model. Computed error measures, with per-criterion ranks in parentheses:
    Algorithm        nonocc      all         disc
    ObjectStereo     2.20 (1)    6.99 (2)    6.36 (1)
    GC+SegmBorder    4.99 (6)    5.78 (1)    8.66 (5)
    PUTv3            2.40 (2)    9.11 (6)    6.56 (2)
    PatchMatch       2.47 (3)    7.80 (3)    7.11 (3)
    ImproveSubPix    2.96 (4)    8.22 (4)    8.55 (4)
    OverSegmBP       3.19 (5)    8.81 (5)    8.89 (6)
Average rank and final ranking:
    Algorithm        Average Rank   Final Ranking
    ObjectStereo     1.33           1
    PatchMatch       3.00           2
    PUTv3            3.33           3
    GC+SegmBorder    4.00           4
    ImproveSubPix    4.00           5
    OverSegmBP       5.33           6
Scharstein, D. and Szeliski, R., http://vision.middlebury.edu/stereo/eval/, 2012.
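As a minimal sketch of the average-rank evaluation model described on this slide (not the official Middlebury code), the Python fragment below reproduces the tables above: it ranks the algorithms per error criterion and averages the three ranks. The error values are the ones shown on the slide.

    # Minimal sketch of the average-rank evaluation model (not the official
    # Middlebury implementation). Ties are broken by sort order, which is one
    # of the weaknesses pointed out two slides below.
    errors = {                      # algorithm: (nonocc, all, disc)
        "ObjectStereo":  (2.20, 6.99, 6.36),
        "GC+SegmBorder": (4.99, 5.78, 8.66),
        "PUTv3":         (2.40, 9.11, 6.56),
        "PatchMatch":    (2.47, 7.80, 7.11),
        "ImproveSubPix": (2.96, 8.22, 8.55),
        "OverSegmBP":    (3.19, 8.81, 8.89),
    }

    def average_ranks(errors):
        names = list(errors)
        ranks = {name: [] for name in names}
        for criterion in range(3):                       # nonocc, all, disc
            ordered = sorted(names, key=lambda n: errors[n][criterion])
            for position, name in enumerate(ordered, start=1):
                ranks[name].append(position)
        return {name: sum(r) / len(r) for name, r in ranks.items()}

    for name, avg in sorted(average_ranks(errors).items(), key=lambda kv: kv[1]):
        print(f"{name:14s} average rank = {avg:.2f}")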
27. Middlebury’s Methodology (iii): Apply evaluation model, interpret results. Under the Middlebury evaluation model, the ObjectStereo algorithm produces accurate results.
    Algorithm        Average Rank   Final Ranking
    ObjectStereo     1.33           1
    PatchMatch       3.00           2
    PUTv3            3.33           3
    GC+SegmBorder    4.00           4
    ImproveSubPix    4.00           5
    OverSegmBP       5.33           6
Scharstein, D. and Szeliski, R., A Taxonomy and Evaluation of Dense Two-Frame Stereo Correspondence Algorithms, IJCV 2002. Scharstein, D. and Szeliski, R., http://vision.middlebury.edu/stereo/eval/, 2012.
28. Middlebury’s Methodology (iv): Weaknesses. The Middlebury evaluation model has some shortcomings: in some cases, the ranks are assigned arbitrarily; the same average ranking does not imply the same performance (and vice versa); the cardinality of the set of top-performing algorithms is a free parameter; and it operates on values related to incommensurable measures.
29. Middlebury’s Methodology (v): Weaknesses. The BMP percentage measures the quantity of disparity estimation errors exceeding a threshold. The BMP measure has some shortcomings: it is sensitive to the threshold selection; it ignores the error magnitude; it ignores the inverse relation between depth and disparity; and it may conceal estimation errors of a large magnitude while penalising errors with small impact on the final 3D reconstruction. Cabezas, I., Padilla, V., and Trujillo, M., A Measure for Accuracy Disparity Maps Evaluation, CIARP 2011. Gallup, D., et al., Variable Baseline/Resolution Stereo, CVPR 2008.
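A minimal NumPy sketch of a BMP-style measure, assuming dense disparity arrays and a validity mask (the array names and the default one-pixel threshold are illustrative, not taken from the slide); it makes the threshold sensitivity and the loss of error magnitude easy to see.

    import numpy as np

    # BMP-style measure: percentage of pixels whose absolute disparity error
    # exceeds a threshold. The error magnitude beyond the threshold is ignored.
    def bad_matched_pixels(d_est, d_gt, mask, threshold=1.0):
        error = np.abs(d_est[mask] - d_gt[mask])
        return 100.0 * np.count_nonzero(error > threshold) / error.size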
30. A* Methodology: The A* evaluation methodology provides a theoretical foundation for the comparison of stereo correspondence algorithms. It is formulated in terms of: the set of algorithms under evaluation; the set of estimated maps to be compared; the function that produces a vector of error measures; and the set of vectors of error measures. Cabezas, I. and Trujillo, M., A Non-Linear Quantitative Evaluation Approach for Disparity Estimation, VISAPP 2011.
31. A* Methodology (ii): The evaluation model of the A* methodology addresses the comparison of stereo correspondence algorithms as a multi-objective optimisation problem. It defines a partition of the set A (the decision space) into the subsets A* and A', where ≺ denotes the Pareto dominance relation. Let p and q be two algorithms, and let Vp and Vq be the corresponding vectors in the objective space; three possible relations are considered: Vp ≺ Vq, Vq ≺ Vp, or neither dominates the other. Cabezas, I. and Trujillo, M., A Non-Linear Quantitative Evaluation Approach for Disparity Estimation, VISAPP 2011.
32. A* Methodology (iii): Pareto Dominance. The Pareto dominance defines a partial order relation. For example, with VGC+SegmBorder = <50.48, 64.90, 24.33>, VPatchMatch = <49.95, 261.84, 32.85> and VImproveSubPix = <50.66, 97.94, 32.01>: GC+SegmBorder ~ PatchMatch (neither dominates the other), while GC+SegmBorder ≺ ImproveSubPix. Van Veldhuizen, D., et al., Considerations in Engineering Parallel Multi-objective Evolutionary Algorithms, IEEE Transactions on Evolutionary Computation, 2003.
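The dominance test itself is straightforward; the sketch below (plain Python, error vectors taken from the slide) checks the two comparisons shown above, where lower error values are better.

    # Pareto dominance for error vectors (lower is better): p dominates q when
    # p is no worse in every component and strictly better in at least one.
    def dominates(vp, vq):
        return (all(a <= b for a, b in zip(vp, vq))
                and any(a < b for a, b in zip(vp, vq)))

    v = {
        "GC+SegmBorder": (50.48, 64.90, 24.33),
        "PatchMatch":    (49.95, 261.84, 32.85),
        "ImproveSubPix": (50.66, 97.94, 32.01),
    }
    # Neither dominates the other: GC+SegmBorder ~ PatchMatch
    print(dominates(v["GC+SegmBorder"], v["PatchMatch"]),
          dominates(v["PatchMatch"], v["GC+SegmBorder"]))    # False False
    # GC+SegmBorder is better in every component: GC+SegmBorder ≺ ImproveSubPix
    print(dominates(v["GC+SegmBorder"], v["ImproveSubPix"]))  # True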
33. A* Methodology (iv): Illustration. Select test bed images; select error criteria (nonocc, all, disc); select and apply stereo algorithms (ObjectStereo, GC+SegmBorder, PUTv3, PatchMatch, ImproveSubPix, OverSegmBP); select error measures; compute error measures. Scharstein, D. and Szeliski, R., High-accuracy Stereo Depth Maps using Structured Light, CVPR 2003. Scharstein, D. and Szeliski, R., http://vision.middlebury.edu/stereo/eval/, 2012.
34. A* Methodology (v): Illustration. The evaluation model performs the partitioning and the grouping of the stereo algorithms under evaluation, based on the Pareto dominance relation. Computed error measures:
    Algorithm        nonocc   all     disc
    ObjectStereo     2.20     6.99    6.36
    GC+SegmBorder    4.99     5.78    8.66
    PUTv3            2.40     9.11    6.56
    PatchMatch       2.47     7.80    7.11
    ImproveSubPix    2.96     8.22    8.55
    OverSegmBP       3.19     8.81    8.89
Applying the evaluation model partitions the algorithms into A* = {GC+SegmBorder, PatchMatch} and A' = {ObjectStereo, PUTv3, ImproveSubPix, OverSegmBP}:
    Algorithm        nonocc   all      disc    Set
    GC+SegmBorder    50.48    64.90    24.33   A*
    PatchMatch       49.95    261.84   32.85   A*
    PUTv3            99.67    333.37   53.79   A'
    ImproveSubPix    50.66    97.94    32.01   A'
    OverSegmBP       58.65    108.60   34.58   A'
    ObjectStereo     73.88    117.90   36.25   A'
35. A* Methodology (vi): Illustration. Interpretation of results is based on the cardinality of the set A*. Under the A* evaluation model, the GC+SegmBorder and PatchMatch algorithms are comparable to each other and outperform the rest of the algorithms.
    Algorithm        nonocc   all      disc    Set
    GC+SegmBorder    50.48    64.90    24.33   A*
    PatchMatch       49.95    261.84   32.85   A*
    ImproveSubPix    50.66    97.94    32.01   A'
    OverSegmBP       58.65    108.60   34.58   A'
    ObjectStereo     73.88    117.90   36.25   A'
    PUTv3            99.67    333.37   53.79   A'
36. A* Methodology (vii): Strength and Weakness. Strength: it allows a formal interpretation of results, based on the cardinality of the set A* and with regard to the considered test-bed imagery. Weakness: it does not allow an exhaustive evaluation of the entire set of algorithms under evaluation; it computes the set A* just once and provides no information about A'. Cabezas, I. and Trujillo, M., A Non-Linear Quantitative Evaluation Approach for Disparity Estimation, VISAPP 2011.
  37. 37. A* Groups Methodology It extends the evaluation model of the A* methodology, incorporating the capability of performing an exhaustive evaluation subject to:  It introduces the partitioningAndGrouping algorithm A = Set ( { } ); A.load( “Algorithms.dat” ); A* = Set ( { } ); A’ = Set ( { } ); group = 1; do { computePartition( A, A*, A’, g, ≺ ); A*.save ( “A*_group_”+group ); group++; A.update ( A’ ); // A = A / A* A*.removeAll ( ); // A* = { } A’.removeAll ( ); // A’ = { } }while ( ! A.isEmpty ( ) ); An Evaluation Methodology for Stereo Correspondence Algorithms, VISAPP 2012, Rome - Italy Slide 37
38. A* Groups Methodology (ii): Sigma-Z-Error. The A* Groups methodology uses the Sigma-Z-Error (SZE) measure. The SZE measure has the following properties: it is inherently related to depth reconstruction in a stereo system; it is based on the inverse relation between depth and disparity; it considers the magnitude of the estimation error; and it is threshold free. Cabezas, I., Padilla, V., and Trujillo, M., A Measure for Accuracy Disparity Maps Evaluation, CIARP 2011.
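A hedged NumPy sketch of a Sigma-Z-Error style computation, built only from the properties listed on this slide (the exact definition and normalisation in Cabezas, Padilla and Trujillo, CIARP 2011, may differ; focal_length, baseline and the array names are illustrative): it accumulates the magnitude of the depth error implied by each disparity estimate through Z = f·B/d, with no threshold involved.

    import numpy as np

    # SZE-style measure: sum of absolute depth-reconstruction errors implied by
    # the disparity estimates (inverse relation Z = f*B/d). The mask is assumed
    # to exclude invalid or zero disparities.
    def sigma_z_error(d_est, d_gt, mask, focal_length, baseline):
        z_est = focal_length * baseline / d_est[mask]
        z_gt  = focal_length * baseline / d_gt[mask]
        return float(np.sum(np.abs(z_est - z_gt)))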
39. A* Groups Methodology (iii): Illustration. The evaluation process of the selected algorithms using the proposal: select test bed images; select error criteria (nonocc, all, disc); select and apply stereo algorithms (ObjectStereo, GC+SegmBorder, PUTv3, PatchMatch, ImproveSubPix, OverSegmBP); select error measures; compute error measures.
40. A* Groups Methodology (iv): Illustration. The evaluation model performs the partitioning and the grouping of the stereo algorithms under evaluation, based on the Pareto dominance relation. In the first iteration, the non-dominated set {GC+SegmBorder, PatchMatch} forms Group 1, and the remaining algorithms are {ObjectStereo, PUTv3, ImproveSubPix, OverSegmBP}:
    Algorithm        nonocc   all      disc    Group
    GC+SegmBorder    50.48    64.90    24.33   1
    PatchMatch       49.95    261.84   32.85   1
    ImproveSubPix    50.66    97.94    32.01
    OverSegmBP       58.65    108.60   34.58
    ObjectStereo     73.88    117.90   36.25
    PUTv3            99.67    333.37   53.79
41. A* Groups Methodology (v): Illustration. Applying the evaluation model again to the remaining algorithms {ObjectStereo, PUTv3, ImproveSubPix, OverSegmBP}, the ImproveSubPix algorithm (50.66, 97.94, 32.01) is non-dominated and forms Group 2. Applying it to {ObjectStereo, PUTv3, OverSegmBP}, the OverSegmBP algorithm (58.65, 108.60, 34.58) is non-dominated and forms Group 3. And so on …
42. A* Groups Methodology (vi): Illustration. Interpretation of results is based on the cardinality of each group. Under the A* Groups evaluation model there are 5 groups of different performance: the GC+SegmBorder and PatchMatch algorithms are comparable to each other and outperform the rest of the algorithms; the ImproveSubPix algorithm is superior to the OverSegmBP, ObjectStereo and PUTv3 algorithms; …; the PUTv3 algorithm has the lowest performance.
    Algorithm        nonocc   all      disc    Group
    GC+SegmBorder    50.48    64.90    24.33   1
    PatchMatch       49.95    261.84   32.85   1
    ImproveSubPix    50.66    97.94    32.01   2
    OverSegmBP       58.65    108.60   34.58   3
    ObjectStereo     73.88    117.90   36.25   4
    PUTv3            99.67    333.37   53.79   5
43. Experimental Results: The conducted evaluation involves the following elements. Test bed images; error criteria: nonocc, all, disc; error measures: SZE, BMP; stereo algorithms: 112 algorithms from the Middlebury repository; evaluation models: A* Groups and Middlebury. Scharstein, D. and Szeliski, R., http://vision.middlebury.edu/stereo/eval/, 2012.
44. Experimental Results (ii):
    Algorithm        Group   Middlebury's Ranking
    ADCensus         2       1
    AdaptingBP       2       2
    CoopRegion       2       3
    DoubleBP         1       4
    RDP              2       5
    OutlierConf      2       6
    SubPixDoubleBP   2       7
    SurfaceStereo    2       8
    WarpMat          2       9
    ObjectStereo     2       10
    PatchMatch       1       11
    Undr+OverSeg     2       12
    GC+SegmBorder    1       13
    InfoPermeable    2       14
    CostFilter       2       15

    Algorithm        Strategy   Group   Middlebury's Ranking
    DoubleBP         Global     1       4
    PatchMatch       Local      1       11
    GC+SegmBorder    Global     1       13
    FeatureGC        Global     1       18
    Segm+Visib       Global     1       29
    MultiresGC       Global     1       30
    DistinctSM       Local      1       34
    GC+occ           Global     1       67
    MultiCamGC       Global     1       68
  45. 45. Conclusions The use of the A* Groups methodology allows to perform an exhaustive evaluation, as well as an objective interpretation of results Innovative results in regard to the comparison of stereo correspondence algorithms were obtained using proposed methodology and the SZE error measure The introduced methodology offers advantages over the conventional approaches to compare stereo correspondence algorithms Authors are already working in order to provide to the research community an accessible way to use the introduced methodology An Evaluation Methodology for Stereo Correspondence Algorithms, VISAPP 2012, Rome - Italy Slide 45
46. An Evaluation Methodology for Stereo Correspondence Algorithms. Ivan Cabezas, Maria Trujillo and Margaret Florian. ivan.cabezas@correounivalle.edu.co. February 25th, 2012. International Conference on Computer Vision Theory and Applications, VISAPP 2012, Rome, Italy.
  47. 47. Final Remarks More information about the MMV-Lab can be found at http://www.slideshare.net/mmv-lab-univalle We are looking forward to create bounds with international collaborators We invite you to participate at the 4th Latin American Conference on Networked and Electronic Media, LACNEM, in Chile next October If you have any question or concern please do not hesitate to contact me ivan.cabezas@correounivalle.edu.co / www.ivancabezas.com MMV Research Laboratory: A Retrospective Around Multimedia and Computer Vision Projects Slide 47
48. MMV Research Laboratory: A Retrospective Around Multimedia and Computer Vision Projects. Ivan Cabezas, ivan.cabezas@correounivalle.edu.co. July 18th, 2012. Universidad Señor de Sipán – Chiclayo, Peru.
