Pattern classification
Pattern Classification: Presentation Transcript

  • 1. Pattern Classification using 2-D Cellular Automata
  • 2. Cellular Automata: a discrete dynamical system that consists of:
    - An N-dimensional grid of cells
    - Each cell in one of a finite number of states
    - Cells constituting the neighbourhood of the cell in consideration
    - A collection of rules determining the state of the cell in the next instant
  • 3. Behind every complex behavior lies simple logic: Cellular Automata
    - Purely local decisions; parallelism
    - Complex behavior emerges from the collective behavior of simple cells
    - Works well with less data; no modeling constraints
    - Cellular Automata therefore finds application in effective pattern classification
  • 4. DATA CLASSIFICATION: meaningful interpretation of voluminous data.
    - The presented algorithm maps each data set instance to a pixel position (2-dimensional in this case).
    - PATTERN CLASSIFICATION was proposed as an application of Sweeper's Algorithm (SA).
    - SA and the Majority rule are used in combination to classify the iris dataset: define regions (patterns) corresponding to distinct disjoint classes of the dataset.
  • 5. Classification Algorithm, Step 1: Mapping instances to pattern matrix M
    - x = average(sepal length, sepal width) × 100
    - y = average(petal length, petal width) × 100
    - M(x, y) = 1 for setosa, 2 for versicolor, 3 for virginica; 0 otherwise
    - Divide matrix M into 3:
      Mset(x, y) = 1 if M(x, y) = 1, 0 otherwise → Setosa.bmp
      Mversi(x, y) = 2 if M(x, y) = 2, 0 otherwise → Versicolor.bmp
      Mvir(x, y) = 3 if M(x, y) = 3, 0 otherwise → Virginica.bmp
    - Example: for an instance (5.1, 3.5, 1.4, 0.2), x = (5.1 + 3.5)/2 × 100 = 430 and y = (1.4 + 0.2)/2 × 100 = 80, so M(430, 80) = 1 (Setosa).
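Step 1 above can be sketched in a few lines. This is an illustrative reconstruction from the slide's formulas, not the authors' code; the function name `map_instance` is invented here.

```python
def map_instance(sepal_len, sepal_wid, petal_len, petal_wid):
    """Map an iris instance to an integer pixel position (x, y),
    per the slide: x and y are attribute averages scaled by 100."""
    x = round((sepal_len + sepal_wid) / 2 * 100)
    y = round((petal_len + petal_wid) / 2 * 100)
    return x, y

# Worked example from the slide: instance (5.1, 3.5, 1.4, 0.2)
x, y = map_instance(5.1, 3.5, 1.4, 0.2)
print(x, y)  # 430 80
```

The matrix M is then just a 2-D array with M[x][y] set to the class label (1, 2, or 3) at each mapped position.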
  • 6. Classification Algorithm, Step 2: Application of Sweeper's Algorithm
    - Sweeper's Algorithm: a null-boundary, 9-neighborhood, hybrid 2-D CA.
    - Given a destination point (x, y), consider an axis passing through the point, dividing the 2-D search region into two (refer to figure alongside).
    - Rotate the axis through an angle of rotation (45 degrees in this case), forming 4 such sets of regions.
    - In each iteration, apply hybrid rules to each of these sets of regions, aimed at bringing the marked pixels a step closer to the axis.
    - As a result, after a certain number of iterations, all marked pixels accumulate around the destination point.
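The transcript does not give the hybrid CA rules themselves, so the following is only a toy sketch of the sweeping *effect*: each marked pixel steps one grid cell toward the destination per iteration until all pixels accumulate there. It is not the paper's rule set.

```python
def sweep_step(pixels, dest):
    """Move every marked pixel one grid step toward the destination point.
    sign(d - p) gives -1, 0, or +1 per coordinate."""
    sign = lambda p, d: (d > p) - (d < p)
    return {(x + sign(x, dest[0]), y + sign(y, dest[1])) for x, y in pixels}

pixels = {(0, 0), (10, 4), (3, 9)}
dest = (5, 5)
for _ in range(10):  # enough iterations for every pixel to reach dest
    pixels = sweep_step(pixels, dest)
print(pixels)  # all pixels have accumulated at (5, 5)
```

In the actual algorithm the motion comes from local CA transition rules applied region by region, not from a global distance computation; the sketch only shows the convergence behavior the slide describes.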
  • 7. Classification Algorithm, Step 2 continued:
    - Apply Sweeper's Algorithm (application of hybrid 2-D CA rules to 'sweep' points near to a single destination point) to each of the .bmp image files (Setosa.bmp, Versicolor.bmp, Virginica.bmp), with the centroid of the white pixels of each image as its destination.
    - Combine the corresponding matrices into one: M' = Mset + Mversi + Mvir
    - Write M' to the image file Combined.bmp.
  • 8. Classification Algorithm, Step 3: Application of Majority Rule
    - Majority rule (applied to each pixel), where ni is the number of neighbors of class i:
      Next state of cell =
        0 : n1 + n2 + n3 = 0
        1 : n1 > n2 and n1 > n3
        2 : n2 > n1 and n2 > n3
        3 : n3 > n1 and n3 > n2
        Rand(1, 2) : n1 = n2 > n3
        Rand(1, 3) : n1 = n3 > n2
        Rand(2, 3) : n2 = n3 > n1
        Rand(1, 2, 3) : n1 = n2 = n3 ≠ 0
    - Moore neighborhood, null boundary, uniform rule.
    - Assign different colors: M'(i, j) = 1 setosa, M'(i, j) = 2 versicolor, M'(i, j) = 3 virginica.
    - Verification: map each instance to a pixel; check the color assigned to the corresponding element (pixel) in the matrix; if it matches, increase the counter.
    - Efficiency = (total number of matches / total number of instances) × 100
    - Conclusion: a linear and non-linear instance-space classifier.
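The majority rule above translates directly to code. A minimal sketch on a small integer grid, assuming as the slide states a Moore neighborhood, a null (zero) boundary, classes coded 1-3 with 0 for empty, and random tie-breaking for the Rand(...) cases:

```python
import random

def majority_step(grid):
    """One synchronous step of the majority rule over a 2-D grid."""
    h, w = len(grid), len(grid[0])
    new = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            counts = {1: 0, 2: 0, 3: 0}
            # Moore neighborhood (8 neighbors); out-of-range cells
            # are skipped, which implements the null boundary.
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    if di == dj == 0:
                        continue
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w and grid[ni][nj]:
                        counts[grid[ni][nj]] += 1
            if sum(counts.values()) == 0:
                new[i][j] = 0  # no classified neighbors
            else:
                best = max(counts.values())
                # Strict majority picks one class; ties realize Rand(...)
                new[i][j] = random.choice(
                    [c for c, n in counts.items() if n == best])
    return new

grid = [[1, 1, 0],
        [1, 0, 2],
        [0, 2, 2]]
new_grid = majority_step(grid)
print(new_grid)
```

Iterating `majority_step` until the grid stabilizes fills the empty space between the swept class clusters, producing the class regions used for verification.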
  • 9. Results:
    - With k = number of instances from each class as the training set, 5 simulations were run for each value of k; averaging the efficiency over all k gives the graph.
    - Efficiency: in the range of 97.6 ± 1.5; nearly the same even with smaller values of k; efficiency = 100% when classifying setosa versus versicolor, compared to 99 ± 3.2 with the voting rule.
    - Complexity: a 2-dimensional grid on the scale of hundreds; 45 iterations of Sweeper's Algorithm and nearly 200 iterations of the Majority rule consume nearly 21 minutes of CPU time on a serial processor.
  • 10. Conclusions:
    - An efficient linear as well as non-linear classifier (efficiency nearly 97.6 ± 1.5%).
    - The sweeper preserves the pattern of the region a class occupies.
    - Mapping attributes to a 2-dimensional grid results in loss of information, yet the classifier remains efficient.
    - Future work: generalize the algorithm for any data set; reduce time complexity; optimize Sweeper's Algorithm for a minimum number of iterations; optimum selection of the destination point to improve the algorithm.
  • 11. THANK YOU