Representative Previous Work


  • 1. Representative Previous Work
    PCA
    LDA
    ISOMAP: Geodesic Distance Preserving (J. Tenenbaum et al., 2000)
    LLE: Local Neighborhood Relationship Preserving (S. Roweis & L. Saul, 2000)
    LE/LPP: Local Similarity Preserving (M. Belkin, P. Niyogi, et al., 2001, 2003)
  • 2. Hundreds of Dimensionality Reduction Algorithms
    Statistics-based: PCA/KPCA, LDA/KDA
    Geometry-based: ISOMAP, LLE, LE/LPP
    Representation: Matrix or Tensor
    Is there a common perspective from which to understand and explain these dimensionality reduction algorithms, or a unified formulation that they all share?
    Is there a general tool to guide the development of new dimensionality reduction algorithms?
  • 3. Our Answers
    Type                     Examples
    Direct Graph Embedding   Original PCA & LDA, ISOMAP, LLE, Laplacian Eigenmap
    Linearization            PCA, LDA, LPP
    Kernelization            KPCA, KDA
    Tensorization            CSA, DATER
    S. Yan, D. Xu, H. Zhang, et al., CVPR 2005; T-PAMI 2007
  • 4. Direct Graph Embedding
    Data in the high-dimensional space and the low-dimensional space (assumed to be 1-D here):
    X = [x_1, ..., x_N],  y = [y_1, ..., y_N]^T
    Intrinsic graph: S is the similarity matrix (graph edges), measuring similarity in the high-dimensional space.
    Penalty graph: S^P encodes the relationships to be suppressed in the low-dimensional space.
    L, B: Laplacian matrices from S and S^P, i.e., L = D - S with D_{ii} = \sum_{j \ne i} S_{ij}, and B = D^P - S^P.
  • 5. Direct Graph Embedding -- Continued
    Intrinsic graph S and penalty graph S^P: similarity matrices (graph edges), measuring similarity in the high-dimensional space; L, B: Laplacian matrices from S, S^P.
    Criterion to preserve graph similarity (a numerical sketch follows this slide):
    y^* = \arg\min_{y^T B y = d} \sum_{i \ne j} \| y_i - y_j \|^2 S_{ij} = \arg\min_{y^T B y = d} y^T L y
    Special case: B is the identity matrix (scale normalization).
    Problem: it cannot handle new test data.
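To make the criterion concrete, here is a minimal NumPy/SciPy sketch of direct graph embedding. The toy data, the Gaussian similarity, and the choice B = I (the scale-normalization special case from the slide) are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np
from scipy.linalg import eigh

# Toy data: columns are N = 40 samples in a 10-D "high-dimensional" space.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 40))

# Intrinsic graph S: Gaussian similarity (an illustrative choice).
d2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
S = np.exp(-d2 / d2.mean())
np.fill_diagonal(S, 0.0)

# Laplacian of the intrinsic graph: L = D - S, with D_ii = sum_j S_ij.
L = np.diag(S.sum(axis=1)) - S

# Special case from the slide: B is the identity (scale normalization).
B = np.eye(S.shape[0])

# y* = argmin_{y^T B y = d} y^T L y  ->  smallest generalized eigenvectors
# of L y = lambda B y; index 0 is the trivial constant vector, so skip it.
eigvals, eigvecs = eigh(L, B)
y = eigvecs[:, 1]  # 1-D embedding of the 40 training samples
```

Note that `y` assigns a coordinate only to the training samples, which is exactly the out-of-sample problem the slide points out.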
  • 6. Linearization
    Linear mapping function: y = X^T w, i.e., y_i = w^T x_i.
    Objective function in linearization (a sketch follows this slide):
    w^* = \arg\min_{w^T X B X^T w = d} w^T X L X^T w
    Problem: a linear mapping function may not be enough to preserve the real nonlinear structure.
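Linearization only swaps the matrices in the same generalized eigenproblem. A minimal sketch, reusing the toy setup from the previous block; the small ridge term is a practical assumption to keep the constraint matrix positive definite.

```python
import numpy as np
from scipy.linalg import eigh

# Same toy setup as the previous sketch.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 40))                      # features x samples
d2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
S = np.exp(-d2 / d2.mean())
np.fill_diagonal(S, 0.0)
L = np.diag(S.sum(axis=1)) - S                     # L = D - S
B = np.eye(40)                                     # scale normalization

# With y = X^T w, the criterion becomes
#   w* = argmin_{w^T X B X^T w = d} w^T X L X^T w.
eigvals, W = eigh(X @ L @ X.T, X @ B @ X.T + 1e-8 * np.eye(10))
w = W[:, 0]                                        # projection direction

# Unlike direct graph embedding, a new test sample can now be embedded:
y_test = w @ rng.normal(size=10)
```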
  • 7. Kernelization
    Nonlinear mapping \phi: maps the original input space to a higher-dimensional Hilbert space.
    Constraint: w = \sum_i \alpha_i \phi(x_i)
    Kernel matrix: K_{ij} = \phi(x_i)^T \phi(x_j) = k(x_i, x_j)
    Objective function in kernelization (a sketch follows this slide):
    \alpha^* = \arg\min_{\alpha^T K B K \alpha = d} \alpha^T K L K \alpha
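The kernelized criterion again changes only the matrices in the eigenproblem. A minimal sketch with the same toy setup; the RBF kernel and its bandwidth are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh

# Same toy setup as the earlier sketches.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 40))                      # features x samples
d2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
S = np.exp(-d2 / d2.mean())
np.fill_diagonal(S, 0.0)
L = np.diag(S.sum(axis=1)) - S
B = np.eye(40)

# RBF kernel matrix (an illustrative choice):
# K_ij = k(x_i, x_j) = exp(-gamma * ||x_i - x_j||^2).
K = np.exp(-0.1 * d2)

# With w = sum_i alpha_i phi(x_i), the criterion becomes
#   alpha* = argmin_{alpha^T K B K alpha = d} alpha^T K L K alpha.
eigvals, A = eigh(K @ L @ K, K @ B @ K + 1e-8 * np.eye(40))
alpha = A[:, 0]

# Embedding of training sample i: y_i = sum_j alpha_j k(x_j, x_i).
y = K @ alpha
```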
  • 8. Tensorization
    The low-dimensional representation is obtained as:
    y_i = X_i \times_1 w^1 \times_2 w^2 \cdots \times_n w^n
    Objective function in tensorization (a sketch of the mode-k product follows this slide):
    (w^1, ..., w^n)^* = \arg\min \sum_{i \ne j} \| X_i \times_k w^k |_{k=1}^n - X_j \times_k w^k |_{k=1}^n \|^2 S_{ij}
    where the constraint is the analogous normalization defined by B.
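The building block here is the mode-k product between a tensor and a projection vector. A minimal NumPy sketch for a 2nd-order tensor; the patch size and the random projection vectors are illustrative (in practice each w^k is optimized in turn under the graph-preserving criterion).

```python
import numpy as np

def mode_k_product(T, w, k):
    """Contract tensor T with vector w along mode k, i.e. T x_k w."""
    return np.tensordot(T, w, axes=([k], [0]))

# Toy 2nd-order tensor, e.g. an 8x6 image patch, with per-mode vectors.
rng = np.random.default_rng(0)
Xi = rng.normal(size=(8, 6))
w1 = rng.normal(size=8)
w2 = rng.normal(size=6)

# y_i = X_i x_1 w^1 x_2 w^2 (a scalar for a 2nd-order tensor). After the
# first contraction the remaining mode shifts to position 0.
yi = mode_k_product(mode_k_product(Xi, w1, 0), w2, 0)
```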
  • 9. Common Formulation
    Intrinsic graph S and penalty graph S^P: similarity matrices; L, B: Laplacian matrices from S, S^P.
    Direct graph embedding:  y^* = \arg\min_{y^T B y = d} y^T L y
    Linearization:  w^* = \arg\min_{w^T X B X^T w = d} w^T X L X^T w
    Kernelization:  \alpha^* = \arg\min_{\alpha^T K B K \alpha = d} \alpha^T K L K \alpha
    Tensorization:  (w^1, ..., w^n)^* = \arg\min \sum_{i \ne j} \| X_i \times_k w^k |_{k=1}^n - X_j \times_k w^k |_{k=1}^n \|^2 S_{ij}, where the constraint is the analogous normalization defined by B.
  • 10. A General Framework for Dimensionality Reduction
    D: Direct Graph Embedding
    L: Linearization
    K: Kernelization
    T: Tensorization
  • 11. New Dimensionality Reduction Algorithm: Marginal Fisher Analysis
    Important information for face recognition:
    1) Label information
    2) Local manifold structure (neighborhood or margin)
    Intrinsic graph: S_{ij} = 1 if x_i is among the k1 nearest neighbors of x_j in the same class; 0 otherwise.
    Penalty graph: S^P_{ij} = 1 if the pair (i, j) is among the k2 shortest cross-class pairs in the data set; 0 otherwise.
    (A sketch of both graph constructions follows this slide.)
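A minimal sketch of the two adjacency matrices defined above. The brute-force distance computation, the toy data, and the default k1/k2 values are assumptions; only the 0/1 adjacency rules come from the slide.

```python
import numpy as np

def mfa_graphs(X, labels, k1=5, k2=20):
    """Intrinsic graph S (within-class k1-NN) and penalty graph Sp
    (k2 shortest cross-class pairs) for Marginal Fisher Analysis.
    X: features x samples; labels: 1-D integer class labels."""
    n = X.shape[1]
    d2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
    same = labels[:, None] == labels[None, :]
    S = np.zeros((n, n))
    Sp = np.zeros((n, n))

    # Intrinsic graph: k1 nearest same-class neighbors of each sample.
    for j in range(n):
        within = np.where(same[:, j] & (np.arange(n) != j))[0]
        for i in within[np.argsort(d2[within, j])[:k1]]:
            S[i, j] = S[j, i] = 1.0

    # Penalty graph: the k2 shortest cross-class pairs in the data set
    # (each unordered pair appears twice in `cross`, hence 2 * k2).
    cross = np.argwhere(~same)
    order = np.argsort(d2[cross[:, 0], cross[:, 1]])[:2 * k2]
    for i, j in cross[order]:
        Sp[i, j] = Sp[j, i] = 1.0
    return S, Sp

# Toy usage: 60 samples in 10-D, 3 classes of 20 samples each.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 60))
labels = np.repeat(np.arange(3), 20)
S, Sp = mfa_graphs(X, labels)
```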
  • 12. Marginal Fisher Analysis: Advantage
    No Gaussian distribution assumption
  • 13. Experiments: Face Recognition
  • 14. Summary
    Optimization framework that unifies previous dimensionality reduction algorithms as special cases.
    A new dimensionality reduction algorithm: Marginal Fisher Analysis.
  • 15. Event Recognition in News Video
    • Online and offline video search
  • 16. 56 events are defined in LSCOM
    Examples: Airplane Flying, Riot, Exiting Car
    Geometric and photometric variations
    Cluttered background
    Complex camera motion and object motion
    More diverse!
  • 17. Earth Mover’s Distance in Temporal Domain (T-MM, Under Review)
    Key frames of two video clips in the class “riot”.
    EMD can efficiently utilize the information from multiple frames (a simplified sketch follows this slide).
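To make the idea concrete: with equal-size key-frame sets and uniform weights, EMD reduces to an optimal one-to-one assignment cost (the general case needs a transportation LP). The feature dimensions and toy data below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def emd_equal_weights(A, B):
    """EMD between two sets of key-frame features with uniform weights.
    With equal-size sets this equals the optimal one-to-one assignment
    cost; A, B have shape (n_frames, feature_dim)."""
    cost = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    return cost[rows, cols].mean()

# Toy usage: two clips, each summarized by 4 key-frame descriptors.
rng = np.random.default_rng(0)
clip_a = rng.normal(size=(4, 128))
clip_b = rng.normal(size=(4, 128))
print(emd_equal_weights(clip_a, clip_b))
```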
  • 18. Multi-level Pyramid Matching (CVPR 2007, Under Review)
    One clip = several subclips (stages of event evolution).
    No prior knowledge about the number of stages in an event; videos of the same event may include only a subset of the stages.
    [Figure: two clips decomposed into level-0 and level-1 subclips; stages such as “fire” and “smoke” can appear in a different order in each clip.]
    Solution: multi-level pyramid matching in the temporal domain (a sketch follows this slide).
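A loose sketch of the idea, not the paper's exact formulation: at level l each clip is split into 2^l subclips, and subclips are aligned by a best one-to-one assignment so that stages (e.g. fire/smoke) may match in either order. The subclip distance and the level weights are assumptions.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def subclip_dist(a, b):
    """Distance between two subclips of frame features (n_frames x dim);
    mean-feature Euclidean distance is a simplifying assumption."""
    return float(np.linalg.norm(a.mean(axis=0) - b.mean(axis=0)))

def temporal_pyramid_match(clip_a, clip_b, levels=1):
    """Multi-level matching in the temporal domain (sketch): split each
    clip into 2^l subclips per level, align subclips optimally, and
    combine levels with weights that favor finer levels."""
    total = 0.0
    for level in range(levels + 1):
        seg_a = np.array_split(clip_a, 2 ** level)
        seg_b = np.array_split(clip_b, 2 ** level)
        cost = np.array([[subclip_dist(a, b) for b in seg_b] for a in seg_a])
        r, c = linear_sum_assignment(cost)
        total += cost[r, c].mean() * 2.0 ** (level - levels)
    return total

# Toy usage: two clips of 16 frames with 64-D frame descriptors.
rng = np.random.default_rng(0)
d = temporal_pyramid_match(rng.normal(size=(16, 64)), rng.normal(size=(16, 64)))
```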
  • 19. Other Publications & Professional Activities
    Other Publications:
    • Kernel-based Learning:
    Coupled Kernel-based Subspace Analysis: CVPR 2005
    Fisher+Kernel Criterion for Discriminant Analysis: CVPR 2005
    • Manifold Learning:
    Nonlinear Discriminant Analysis on Embedded Manifold: T-CSVT (Accepted)
    • Face Verification:
    Face Verification with Balanced Thresholds: T-IP (Accepted)
    • Multimedia:
    Insignificant Shadow Detection for Video Segmentation: T-CSVT 2005
    Anchorperson Extraction for Picture-in-Picture News Video: PRL 2005
    Guest Editor:
    • Special issue on Video Analysis, Computer Vision and Image Understanding
  • 20. Special issue on Video-based Object and Event Analysis, Pattern Recognition Letters
    Book Editor:
    • Semantic Mining Technologies for Multimedia Databases
    Publisher: Idea Group Inc. (www.idea-group.com)
  • 21. Future Work
    Machine Learning
    Event Recognition
    Biometrics
    Computer Vision
    Pattern Recognition
    Web Search
    Multimedia Content Analysis
    Multimedia
  • 22. Acknowledgement
    Shuicheng Yan (UIUC)
    Steve Lin (Microsoft)
    Lei Zhang (Microsoft)
    Hong-Jiang Zhang (Microsoft)
    Shih-Fu Chang (Columbia)
    Xuelong Li (UK)
    Xiaoou Tang (Hong Kong)
    Zhengkai Liu (USTC)
  • 23. Thank You very much!
  • 24. What Are Gabor Features?
    Gabor features can improve recognition performance in comparison to grayscale features (Chengjun Liu, T-IP, 2002).
    Input: a grayscale image.
    Gabor wavelet kernels: five scales × eight orientations.
    Output: 40 Gabor-filtered images (a sketch of the filter bank follows this slide).
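A sketch of the 5-scale × 8-orientation bank using OpenCV's `cv2.getGaborKernel`. The kernel size, sigma/wavelength ratio, wavelength progression, and input filename are all illustrative assumptions, not values from the cited paper.

```python
import cv2
import numpy as np

def gabor_bank(n_scales=5, n_orient=8, ksize=31):
    """Build a 5-scale x 8-orientation Gabor kernel bank (40 kernels).
    Wavelength progression and sigma/lambda ratio are assumptions."""
    kernels = []
    for s in range(n_scales):
        lambd = 4.0 * 2.0 ** (s / 2.0)          # wavelength grows with scale
        for o in range(n_orient):
            theta = o * np.pi / n_orient         # orientation
            kernels.append(cv2.getGaborKernel(
                (ksize, ksize), sigma=0.56 * lambd, theta=theta,
                lambd=lambd, gamma=1.0, psi=0.0))
    return kernels

# Input: one grayscale image -> output: 40 Gabor-filtered images.
img = cv2.imread("face.png", cv2.IMREAD_GRAYSCALE)   # hypothetical file
responses = [cv2.filter2D(img, cv2.CV_32F, k) for k in gabor_bank()]
assert len(responses) == 40                          # 5 scales x 8 orientations
```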
  • 25. How to Utilize More Correlations?
    Pixel rearrangement: sets of highly correlated pixels are rearranged into columns of highly correlated pixels.
    Potential assumption in previous tensor-based subspace learning:
    Intra-tensor correlations: correlations among the features within certain tensor dimensions, such as rows, columns, and Gabor features.
  • 26. Tensor Representation: Advantages
    1. Enhanced learnability
    2. Appreciable reductions in computational costs
    3. Large number of available projection directions
    4. Utilization of structure information
  • 27. Connection to Previous Work: Tensorface (M. Vasilescu and D. Terzopoulos, 2002)
    From an algorithmic or mathematical view, CSA and Tensorface are both variants of the rank-(R1, R2, ..., Rn) decomposition.