International Journal of Engineering Research and Applications (IJERA), ISSN: 2248-9622, www.ijera.com, Vol. 2, Issue 5, September-October 2012, pp. 246-249

An Innovative Method for Image Retrieval Using Perceptual Textural Features

Prathusha Perugu 1, Krishna Veni 2
1 M.Tech Student, Kottam College of Engineering, Kurnool, A.P.
2 Asst. Prof., Dept. of Computer Science and Engineering, Kottam College of Engineering, Tekur, Kurnool, A.P.

Abstract: Texture is a very important image feature, widely used in image processing problems such as computer vision and image representation and retrieval. It has been shown that humans use a small set of perceptual textural features to distinguish between textured images or regions; among the most important are coarseness, contrast, directionality and busyness. In this paper a study is made of these texture features. The proposed computational measures can be based upon two representations: the original image representation and the autocorrelation function (associated with the original image) representation.

Keywords: Texture features, Content Based Image Retrieval, Computational measures

I. Introduction
Texture has been extensively studied and used in the literature since it plays a very important role in human visual perception. Texture refers to the spatial distribution of grey-levels and can be defined as the deterministic or random repetition of one or several primitives in an image.

Texture analysis techniques can be divided into two main categories: spatial techniques and frequency-based techniques. Frequency-based methods include the Fourier transform and wavelet-based methods such as the Gabor model. Spatial texture analysis methods can be categorized as statistical methods, structural methods, or hybrid methods.

It is widely accepted in the computer vision community that there is a set of textural features that human beings use to recognize and categorize textures. Such features include coarseness, contrast and directionality.

II. Autocorrelation function
The autocorrelation function [1], denoted f(\delta_i, \delta_j), for an image I of n x m dimensions is defined as

f(\delta_i, \delta_j) = \frac{1}{(n - \delta_i)(m - \delta_j)} \sum_{i=0}^{n - \delta_i - 1} \sum_{j=0}^{m - \delta_j - 1} I(i, j)\, I(i + \delta_i, j + \delta_j)    (1)

For images containing repetitive texture patterns, the autocorrelation function exhibits periodic behaviour with the same period as the original image. For coarse textures the autocorrelation function decreases slowly, whereas for fine textures it decreases rapidly. For images containing orientation(s), the autocorrelation function preserves the same orientation(s).

In this paper the test images are taken from the Brodatz database [2] and the autocorrelation function is applied to the original images shown in Figure 1. The original images are scaled into nine parts and the autocorrelation function is applied to one of the scaled images chosen at random; the scaled images are shown in Figure 2. Any one of the scaled images can be selected and the autocorrelation function applied to it.

Fig. 1: Original images from the Brodatz database
Fig. 2: Scaled images
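As a minimal sketch (not part of the original paper), the normalized autocorrelation of Eq. (1) could be computed for a greyscale NumPy array as follows; the function name and the displacement range max_delta are illustrative assumptions.

```python
import numpy as np

def autocorrelation(image, max_delta=64):
    """Normalized autocorrelation f(delta_i, delta_j) of a grey-level image, Eq. (1)."""
    img = image.astype(np.float64)
    n, m = img.shape
    f = np.zeros((max_delta, max_delta))
    for di in range(max_delta):
        for dj in range(max_delta):
            # Overlap between the image and its copy shifted by (di, dj)
            a = img[:n - di, :m - dj]
            b = img[di:, dj:]
            f[di, dj] = np.sum(a * b) / ((n - di) * (m - dj))
    return f
```

The resulting array f is the representation on which the coarseness and directionality measures described below operate.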
Figure 3 shows the randomly selected scaled image and its histogram, and Figure 4 shows the result of applying the autocorrelation function to that scaled image together with its histogram.

Fig. 3: Randomly selected scaled image and its histogram
Fig. 4: Autocorrelation applied to the scaled image and its histogram

III. Computational measures for textural features
Perceptual features have been largely used both in texture retrieval and in content-based image retrieval (CBIR) [4] systems such as:
• the IBM QBIC system, which implements the Tamura features;
• CANDID, by Kelly et al. at the Los Alamos National Laboratory, using Laws filters;
• Photobook, by Pentland et al. at MIT, based on the Wold texture decomposition model.

The general estimation process of computational measures simulating human visual perception is as follows [3]:
Step 1: The autocorrelation f(\delta_i, \delta_j) is computed on the image I(i, j).
Step 2: The convolution of the autocorrelation function with the gradient of the Gaussian function is computed in a separable way.
Step 3: Based on these two functions, computational measures for each perceptual feature are computed.

A. Coarseness
Coarseness is the most important feature; it is coarseness that determines the existence of texture in an image. Coarseness measures the size of the primitives that constitute the texture. A coarse texture is composed of large primitives and is characterized by a high degree of local uniformity of grey-levels; a fine texture is composed of small primitives and is characterized by a high degree of local variation of grey-levels.

First, we compute the first derivative of the autocorrelation function in a separable way according to rows and columns, respectively. Two functions C_x(i, j) and C_y(i, j) are then obtained:

C_x(i, j) = f(i, j) - f(i + 1, j),   C_y(i, j) = f(i, j) - f(i, j + 1)    (2)

Second, we compute the first derivatives of the obtained functions C_x(i, j) and C_y(i, j) in a separable way according to rows and columns. Two functions C_{xx}(i, j) and C_{yy}(i, j) are then obtained:

C_{xx}(i, j) = C_x(i, j) - C_x(i + 1, j),   C_{yy}(i, j) = C_y(i, j) - C_y(i, j + 1)    (3)

To detect maxima, we use the following conditions (according to rows and columns, respectively):

C_x(i, j) = 0 and C_{xx}(i, j) < 0    (4)
C_y(i, j) = 0 and C_{yy}(i, j) < 0    (5)

Coarseness, denoted Cs, is estimated from the average number of maxima in the autocorrelation function: a coarse texture will have a small number of maxima and a fine texture will have a large number of maxima. Let Max_x(i, j) = 1 if pixel (i, j) is a maximum on rows and Max_x(i, j) = 0 otherwise; similarly, let Max_y(i, j) = 1 if pixel (i, j) is a maximum on columns and Max_y(i, j) = 0 otherwise. Coarseness can then be expressed by the following equation:

Cs = \frac{1}{\frac{1}{2}\left( \sum_{i=0}^{n-1} \sum_{j=0}^{m-1} Max_x(i, j)/n + \sum_{j=0}^{m-1} \sum_{i=0}^{n-1} Max_y(i, j)/m \right)}    (6)

The denominator gives the average number of maxima according to rows and columns: the higher the number of maxima, the lower the coarseness, and vice versa.
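As an illustrative sketch, the coarseness estimate of Eqs. (2)-(6) could be implemented on the autocorrelation array f from the previous sketch as follows; replacing the exact condition C_x(i, j) = 0 with a sign-change test is an assumption made because exact zeros rarely occur on sampled data.

```python
import numpy as np

def coarseness(f):
    """Coarseness Cs as the inverse average number of maxima of f, Eqs. (2)-(6)."""
    n, m = f.shape
    # First differences along rows and columns, Eq. (2)
    cx = f[:-1, :] - f[1:, :]
    cy = f[:, :-1] - f[:, 1:]
    # Second differences, Eq. (3)
    cxx = cx[:-1, :] - cx[1:, :]
    cyy = cy[:, :-1] - cy[:, 1:]
    # Maxima: the first derivative changes sign and the second derivative is negative,
    # an approximation of the conditions in Eqs. (4)-(5)
    maxx = (np.sign(cx[:-1, :]) != np.sign(cx[1:, :])) & (cxx < 0)
    maxy = (np.sign(cy[:, :-1]) != np.sign(cy[:, 1:])) & (cyy < 0)
    # Average number of maxima per row and per column, the denominator of Eq. (6)
    avg_maxima = 0.5 * (maxx.sum() / n + maxy.sum() / m)
    return 1.0 / avg_maxima if avg_maxima > 0 else float("inf")
```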
B. Contrast
Contrast measures the degree of clarity with which one can distinguish between the different primitives in a texture. A well contrasted image is an image in which the primitives are clearly visible and separable. Local contrast is commonly defined for each pixel as an estimate of the local variation in a neighbourhood. More precisely, given a pixel p = (i, j) and a W x W neighbourhood mask around it, local contrast is computed as

L_{Ct}(i, j) = \frac{\max_{p \in W \times W}(p) - \min_{p \in W \times W}(p)}{\max_{p \in W \times W}(p) + \min_{p \in W \times W}(p)}    (7)

We propose to measure the global contrast as the arithmetic mean of all the local contrast values over the image:

G_{Ct} = \frac{1}{m \cdot n} \sum_{i=1}^{n} \sum_{j=1}^{m} L_{Ct}(i, j)    (8)

where n and m are the dimensions of the image.

Fig. 5: The least and the most contrasted textures in the VisTex database

Figure 5 shows the most and the least contrasted textures in the VisTex texture database [5] with respect to the proposed measure.
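A possible sketch of Eqs. (7)-(8), assuming a square W x W window and using SciPy's rank filters for the neighbourhood maxima and minima; the default window size of 5 is an illustrative assumption, not a value given in the paper.

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def global_contrast(image, w=5):
    """Global contrast G_Ct as the mean of local contrasts over W x W windows, Eqs. (7)-(8)."""
    img = image.astype(np.float64)
    local_max = maximum_filter(img, size=w)
    local_min = minimum_filter(img, size=w)
    denom = local_max + local_min
    local_ct = np.zeros_like(img)
    mask = denom > 0                      # guard against flat all-zero neighbourhoods
    local_ct[mask] = (local_max[mask] - local_min[mask]) / denom[mask]   # Eq. (7)
    return local_ct.mean()                # Eq. (8)
```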
C. Directionality
Regarding directionality, we want to estimate two parameters: the dominant orientation(s) and the degree of directionality. Orientation refers to the global orientation of the primitives that constitute the texture. The degree of directionality is related to the visibility of the dominant orientation(s) in an image, and refers to the number of pixels having the dominant orientation(s).

Orientation estimation: When considering the autocorrelation function, one can notice two phenomena concerning orientation:
1. an orientation existing in the original image is preserved in the corresponding autocorrelation function;
2. using the autocorrelation function instead of the original image keeps the global orientation, whereas using the original image gives only the local orientation.

It follows that the global orientation of the image can be estimated by applying the gradient to the autocorrelation function of the original image according to the rows (C_x) and according to the columns (C_y). The orientation \theta is given by

\theta = \arctan\left( \frac{C_y}{C_x} \right)    (9)

We consider only pixels with a significant orientation: a pixel is considered oriented if its gradient amplitude is greater than a certain threshold t.

Directionality estimation: For directionality, we consider the number of pixels having the dominant orientation(s) \theta_d. Let \theta_d(i, j) = 1 if pixel (i, j) has a dominant orientation \theta_d and \theta_d(i, j) = 0 otherwise. We consider only dominant orientations \theta_d that are present in a sufficient number of pixels, i.e. in more than a threshold t of pixels, so that the orientation becomes visible. Let N_{\theta nd} denote the number of non-oriented pixels. The degree of directionality N_{\theta d} of an image can then be expressed as

N_{\theta d} = \frac{\sum_{i=0}^{n-1} \sum_{j=0}^{m-1} \theta_d(i, j)}{n \times m - N_{\theta nd}}    (10)

The larger N_{\theta d} is, the more directional the image; the smaller N_{\theta d} is, the less directional the image.
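As a rough sketch only (the paper gives no implementation details), Eqs. (9)-(10) could be estimated from the autocorrelation array f as follows; the relative amplitude threshold t and the 16-bin orientation histogram used to pick a single dominant orientation are assumptions made for illustration.

```python
import numpy as np

def directionality(f, t=0.01, n_bins=16):
    """Degree of directionality N_theta_d estimated from the autocorrelation f, Eqs. (9)-(10)."""
    # Gradient of the autocorrelation along rows (Cx) and columns (Cy)
    cx = f[:-1, :-1] - f[1:, :-1]
    cy = f[:-1, :-1] - f[:-1, 1:]
    amplitude = np.hypot(cx, cy)
    theta = np.arctan2(cy, cx)                     # orientation, Eq. (9)
    oriented = amplitude > t * amplitude.max()     # keep only significantly oriented pixels
    if not oriented.any():
        return 0.0
    n_not_oriented = np.count_nonzero(~oriented)
    # Histogram of orientations over oriented pixels; the dominant orientation is the peak bin
    hist, edges = np.histogram(theta[oriented], bins=n_bins, range=(-np.pi, np.pi))
    bin_idx = np.digitize(theta, edges) - 1
    dominant = oriented & (bin_idx == hist.argmax())
    n, m = theta.shape
    return dominant.sum() / (n * m - n_not_oriented)   # Eq. (10)
```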
IV. Experimental Results
The computational perceptual features are used to retrieve images. Figure 6 shows the window used to select the query image; one of the available images is chosen as the query and similar images are retrieved. In Figure 7 the total number of retrieved images is nine. Precision is calculated as the number of relevant retrieved images divided by the total number of retrieved images, i.e. Precision = 6/9 ≈ 0.667 in this example.

Figure 6: Window to select the query image
Figure 7: Retrieved images

V. Conclusion
An innovative method for content-based image retrieval using perceptual features of texture has been presented. The precision was calculated and shown experimentally to lie between 0.6 and 0.8, and this method of image retrieval is effective compared to existing methods.

References
1. N. Abbadeni, D. Ziou, and S. Wang, "Autocovariance-based perceptual textural features corresponding to human visual perception," Proceedings of the 15th International Conference on Pattern Recognition, Barcelona, Spain, September 3-8, 2000.
2. P. Brodatz, Textures: A Photographic Album for Artists and Designers. New York: Dover, 1966.
3. N. Abbadeni, "Computational perceptual features for texture representation and retrieval," IEEE Transactions on Image Processing, vol. 20, no. 1, January 2011.
4. S. Sanam and M. Sudha Madhuri, "Perception based texture classification, representation and retrieval," International Journal of Computer Trends and Technology, vol. 3, issue 1, 2012.
5. VisTex texture database: http://www.white.media.mit.edu/vismod/imagery/VisionTexture/vistex.html
