A HVS based Perceptual Quality Estimation Measure for Color Images

ACEEE Int. J. on Signal & Image Processing, Vol. 03, No. 01, Jan 2012

Prateek Gupta, Priyanka Srivastava, Satyam Bhardwaj and Vikrant Bhateja
Department of Electronics and Communication Engineering, Shri Ramswaroop Memorial Group of Professional Colleges, Lucknow-227105 (U.P.), India
Email: {enggprateek.gupta, priyanka.srivastava4989, er.satyam123, bhateja.vikrant}@gmail.com

Abstract—Human eyes are the best evaluation model for assessing image quality, as they are the ultimate receivers in numerous image processing applications. Mean squared error (MSE) and peak signal-to-noise ratio (PSNR) are the two most common full-reference measures for objective assessment of image quality. They are well known for their computational simplicity and applicability to optimization, but they fail to correlate well with the characteristics of the Human Visual System (HVS). In this paper, a novel HVS-based perceptual quality estimation measure for color images is proposed. The effects of error, structural distortion and edge distortion are taken into account in order to determine the perceptual quality of an image contaminated with various types of distortion, such as noise, blurring, compression, contrast stretching and rotation. Subjective evaluation using the Difference Mean Opinion Score (DMOS) is also performed to assess the perceived image quality. As shown by the correlation values, the proposed quality estimation measure proves to be an efficient HVS-based quality index, and the comparative results show better performance than conventional PSNR and Structural Similarity (SSIM).


Index Terms—PSNR, HVS, RGB color model, Quality Measure (Q), Subjective Evaluation, DMOS.

I. INTRODUCTION

Although the field of digital image processing is built on the foundation of mathematical formulations, human intuition and analysis play a central role in the choice of one technique over another [1]. The visual quality of a color image is best adjudged by the look and feel of the image to the human visual system, which has therefore become a major platform for the quality evaluation of color images. Distortions are introduced in an image during storage, reconstruction, compression and enhancement, so quality assessment is a crucial task in image processing applications [2]. Objective evaluation models use mathematical expressions for the assessment of image quality, while subjective models are based on the physiological and psychological perception of humans and evaluate the quality of the image as perceived by an individual observer. The features of both these models are incorporated in the HVS model, which correlates well with the perceived image quality. The scope of conventional HVS-based evaluation models is limited, as they consider error sensitivity as the only parameter for the assessment of image quality [3]. The conventional quality assessment indexes MSE and PSNR are operationally simple and have a clear physical meaning, but are unable to assess the similarity between different distortion types [4]. The compatibility of SSIM [5] with HVS characteristics accounts for its wide applicability, yet the performance of this metric degrades for poor quality and highly textured images. Apart from its computational complexity (in comparison to MSE), SSIM is also not efficient for relative translations, scaling and rotation of images [4], [6]. Eric Wharton et al. [7] included Michelson contrast and MSE to make their proposed metrics more compatible with the HVS, but their metrics do not account for structural and edge changes, which are mainly noticed by human eyes. Wan Yang et al. [8] used a method of image quality assessment based on Region of Interest (ROI) and structural similarity. They modified classical SSIM by using a circular-symmetric Gaussian weighting function and a gradient operator to enhance the performance of their metric, but it does not account for edge modifications in the image, which are mainly noticed by human eyes. Wei Fu et al. proposed a similarity index [9] for color images that uses edge, luminance and structural similarity for the assessment of quality; however, their model neglects the effects of other color image components, such as contrast, that carry information about the visual quality of the image. Bo Wang et al. in their quality index [10] employ the HVS characteristics in both the frequency and spatial domains. Ho-Sung Han et al. [11] used gradient information for the assessment of image quality. Their approach considers only the effect of large differences between the pixels of the original and distorted images; distortions producing small pixel differences are neglected. Recently, Chen Yutuo et al. proposed an evaluation method for coded color images [12] based on the HVS model, using image pixels, region construction and edges as evaluation parameters; however, their simulations considered only a limited set of distortions and are silent about the effect of different noise types, rotation, blurring, etc. Many objective evaluations proposed in the literature have at times proved complex and could not compete with the conventional PSNR [13]-[15].
It is true that images with higher values of PSNR often do not yield good visualization for human observers; such measures are also limited in their ability to assess over-enhancement, blurring effects and weak edges, which are the primary outcomes of processing by sharp enhancement or denoising filters. This paper proposes a full-reference perceptual quality estimation measure for color images incorporating the known characteristics of the HVS. The proposed evaluation model provides an objective assessment of the image by taking into account error sensitivity, structural distortion and edge distortion. It uses the RGB model for color images and empirically combines the above effects on each color plane. The remaining part of this paper is structured as follows: Section II describes the method for estimating the degree of distortion and the proposed perceptual quality estimation measure. Details of the experimental procedure and the obtained results are discussed in Section III. Finally, the conclusions are drawn at the end.

II. PROPOSED EVALUATION MODEL

Image degradations can be modeled as losses in the perceived structural content in addition to the error sensitivity estimation (as in the Minkowski error metric [16]). There are still numerous structural changes which cannot be captured by the above estimators; one of them is edge distortion, a primary outcome of images restored by sharp non-linear filters that remove high-density noise contamination at the cost of weak edges and loss of finer details. The motivation for our work is to develop a more precise evaluation framework for degraded images by modeling distortion in terms of error sensitivity along with structural and edge distortions. The proposed full-reference image quality measure (Q) performs evaluation as the weighted sum of the above-mentioned distortions (i.e. error sensitivity, structural distortion and edge distortion) calculated for each of the R, G and B color planes separately. Figure 1 shows the block diagram of the proposed methodology for estimation of distortion in color images.

Figure 1. Block diagram of the proposed quality metric.

A. Measurement of Error

The distortion introduced in the image changes the pixel gray values, which is readily noticed by the HVS [12]. Thus, the error introduced in the three color components is calculated separately for quality measurement. If x(i,j) denotes the reference image and y(i,j) the distorted image, then the error introduced in the R component is calculated as:

E_r = \frac{1}{MN} \sum_{i=1}^{M} \sum_{j=1}^{N} \left[ x(i,j) - y(i,j) \right]^2    (1)

where i and j are the pixel positions of the M x N image. Similarly, E_g and E_b are the errors in the G and B components, calculated as in (1). The overall error is estimated as the average of the errors for the R, G and B components, and hence PSNR_E (dB) due to error is given as:

PSNR_E = 10 \log_{10} \left[ \frac{3}{E_r + E_g + E_b} \right]    (2)
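As a concrete illustration of Eqs. (1)-(2), the following is a minimal NumPy sketch of the per-channel error and the resulting PSNR_E term. It assumes the color planes have already been normalized to [0, 1] in the pre-processing stage; the function names are ours, not from the paper.

import numpy as np

def channel_error(x, y):
    # Eq. (1): mean squared error between one color plane of the reference
    # image x and the distorted image y (float arrays scaled to [0, 1]).
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return np.mean((x - y) ** 2)

def psnr_e(x_rgb, y_rgb):
    # Eq. (2): PSNR_E = 10*log10(3 / (E_r + E_g + E_b)) for an M x N x 3 RGB pair.
    e_sum = sum(channel_error(x_rgb[..., c], y_rgb[..., c]) for c in range(3))
    return 10.0 * np.log10(3.0 / e_sum)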
B. Measurement of Structural Distortion

Human eyes mainly extract structural changes from the viewing field, so structural distortion is an important parameter for quality assessment. In the proposed work, this distortion is measured by dividing the image into equal-sized, non-overlapping square regions and calculating the mean, maximum and minimum pixel values in each region. Since the maximum and minimum pixel values decide the contrast level of a region, the proposed measure also estimates the contrast change in the distorted image and can therefore be used to evaluate quality in coherence with the HVS. The structural distortions S_r, S_g and S_b for the three color components R, G and B respectively can be calculated as:

S_r = \frac{1}{N} \sum_{i=1}^{N} \left[ 0.5\,(X_{a_i} - Y_{a_i})^2 + 0.25\,(X_{p_i} - Y_{p_i})^2 + 0.25\,(X_{b_i} - Y_{b_i})^2 \right]    (3)

where X_{a_i}, X_{p_i}, X_{b_i} and Y_{a_i}, Y_{p_i}, Y_{b_i} denote the mean, maximum and minimum pixel values of region i for the reference and the distorted image respectively, and N is the number of regions into which the image is divided. The overall structural distortion is the mean of the structural distortions of the three color components, and hence PSNR_S (dB) of the structural distortion is given as:

PSNR_S = 10 \log_{10} \left[ \frac{3}{S_r + S_g + S_b} \right]    (4)

C. Measurement of Edge Distortion

Edges denote the sites where there is an abrupt change in pixel value or texture. A distorted image with edges very similar to those of the reference image gives a very high perceptual quality for the HVS, although PSNR and SSIM may produce an opposite result [9]. The Canny edge detection algorithm [17] is used in this work, as it is well known for its large signal-to-noise ratio and high precision. The edge distortion for the R component is given as:

ED_r = \frac{1}{MN} \sum_{i=1}^{M} \sum_{j=1}^{N} \left[ x_e(i,j) - y_e(i,j) \right]^2    (5)

where x_e(i,j) and y_e(i,j) denote the edge maps of the original and distorted images. Similarly, ED_g and ED_b are calculated for the G and B components. Hence the overall PSNR_ED (dB) of the edge distortion is given as:

PSNR_{ED} = 10 \log_{10} \left[ \frac{3}{ED_r + ED_g + ED_b} \right]    (6)
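A corresponding sketch for the structural and edge terms of Eqs. (3)-(6) is given below. The paper does not state the region size or the Canny parameters, so the 8 x 8 block size and the default scikit-image Canny settings are assumptions; psnr_s and psnr_ed mirror psnr_e above.

import numpy as np
from skimage.feature import canny  # Canny edge detector used for Eq. (5)

def channel_structural(x, y, block=8):
    # Eq. (3): compare per-region mean (X_a), maximum (X_p) and minimum (X_b)
    # of reference and distorted planes, weighted 0.5 / 0.25 / 0.25.
    h = (x.shape[0] // block) * block
    w = (x.shape[1] // block) * block
    total, n = 0.0, 0
    for i in range(0, h, block):
        for j in range(0, w, block):
            bx = x[i:i + block, j:j + block]
            by = y[i:i + block, j:j + block]
            total += (0.5 * (bx.mean() - by.mean()) ** 2
                      + 0.25 * (bx.max() - by.max()) ** 2
                      + 0.25 * (bx.min() - by.min()) ** 2)
            n += 1
    return total / n

def channel_edge(x, y, sigma=1.0):
    # Eq. (5): mean squared difference between the binary Canny edge maps
    # of a reference plane and the corresponding distorted plane.
    xe = canny(x, sigma=sigma).astype(float)
    ye = canny(y, sigma=sigma).astype(float)
    return np.mean((xe - ye) ** 2)

def psnr_s(x_rgb, y_rgb, block=8):
    # Eq. (4): PSNR_S = 10*log10(3 / (S_r + S_g + S_b)).
    s = sum(channel_structural(x_rgb[..., c], y_rgb[..., c], block) for c in range(3))
    return 10.0 * np.log10(3.0 / s)

def psnr_ed(x_rgb, y_rgb, sigma=1.0):
    # Eq. (6): PSNR_ED = 10*log10(3 / (ED_r + ED_g + ED_b)).
    ed = sum(channel_edge(x_rgb[..., c], y_rgb[..., c], sigma) for c in range(3))
    return 10.0 * np.log10(3.0 / ed)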
D. Proposed Quality Estimation Measure

The proposed quality measure (Q), in dB, is the weighted sum of the different PSNR components (in dB) based on the error, structural and edge distortions obtained from (2), (4) and (6) respectively:

Q = \alpha\,PSNR_E + \beta\,PSNR_{ED} + \gamma\,PSNR_S    (7)

where the coefficients \alpha, \beta and \gamma take different values, empirically determined for different types of distortion. Different coefficients are introduced in (7) because each type of distortion has a different effect on the error, structural and edge components; these variable weights make the proposed quality measure more sensitive and versatile towards specific types of distortion. In order to make the proposed measure distortion specific, distortions are divided into three categories. Distortions specific to noise attacks are grouped under category I; contrast distortions and degradations due to rotation are considered under category II; and category III groups the blurring and compression artifacts. The quality measure (Q) in dB can then be defined by the following matrix:

\begin{bmatrix} Q_I \\ Q_{II} \\ Q_{III} \end{bmatrix} = \begin{bmatrix} 0.25 & 0.5 & 0.25 \\ 0.35 & 0.4 & 0.25 \\ 0.35 & 0.25 & 0.4 \end{bmatrix} \begin{bmatrix} PSNR_E \\ PSNR_{ED} \\ PSNR_S \end{bmatrix}    (8)

where Q_I, Q_II and Q_III are the quality measures in dB for distortion categories I, II and III respectively. In category I, noise contamination deforms the edges more severely than the other factors in the image, so a higher coefficient is assigned to the PSNR component of edge distortion, while the other two factors have an average effect on the image. Category II considers contrast change and rotation, which only modify the pixel values without causing any noticeable edge or structural distortion; hence, a comparatively higher coefficient is assigned to the PSNR component for error sensitivity than in category I. Category III includes blurring and compression, which cause large structural modifications in the image; this justifies the choice of higher coefficients for the PSNR components of error and structural distortion respectively. A generalized version of the proposed quality measure, denoted Q_0 (in dB), is given in (9) and is independent of the distortion type:

Q_0 = 0.32\,(PSNR_E + PSNR_{ED}) + 0.36\,PSNR_S    (9)
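A short sketch of how Eqs. (7)-(8) combine the three PSNR terms is shown below, using the category weights as reconstructed in (8); the dictionary layout and function name are ours, not the paper's.

# Category-specific (alpha, beta, gamma) weights from Eq. (8), applied to
# (PSNR_E, PSNR_ED, PSNR_S); each row sums to one.
CATEGORY_WEIGHTS = {
    "I":   (0.25, 0.50, 0.25),   # noise contaminations
    "II":  (0.35, 0.40, 0.25),   # contrast change and rotation
    "III": (0.35, 0.25, 0.40),   # blurring and compression
}

def quality_q(psnr_e_db, psnr_ed_db, psnr_s_db, category="I"):
    # Eq. (7): Q = alpha*PSNR_E + beta*PSNR_ED + gamma*PSNR_S, all in dB.
    a, b, g = CATEGORY_WEIGHTS[category]
    return a * psnr_e_db + b * psnr_ed_db + g * psnr_s_db

The distortion-independent variant Q_0 of Eq. (9) follows the same pattern with a single fixed weight triple in place of a category row.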
III. EXPERIMENTAL RESULTS AND DISCUSSION

A. Experimental Procedure

The input reference image used for the experiments in this work is the standard Lena image of size 256 x 256. As shown in Figure 1, the early transformations (such as normalization and gray conversion) are applied to the input image in the pre-processing block. Different types of distortions are then superimposed on the pre-processed image; these include noise contaminations (salt & pepper, speckle and Gaussian noise), blurring (motion and Gaussian blur), contrast change, compression and rotation. Figure 2 shows the different simulated distortions applied to the pre-processed input reference image. The color image is then decomposed into its three components (the R, G and B color planes) for further processing. The PSNR components for error sensitivity, structural and edge distortion are then calculated for all the components of the image as described in (1)-(6). Finally, the obtained results are empirically combined in the mathematical combiner to yield the quality measure (Q) in dB as in (7)-(8), which specifically quantifies the effect of each type of simulated distortion. Table I compares the obtained values of the proposed measure (Q) in dB with the conventional PSNR (dB) for each category of distortion.

TABLE I. Evaluation results of conventional PSNR and proposed quality metric (Q).
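To make the procedure concrete, the sketch below generates a few of the listed distortions with scikit-image/SciPy and scores one of them with the helper functions from the previous sketches; the parameter values and the file name are illustrative assumptions, not taken from the paper.

from scipy.ndimage import gaussian_filter
from skimage import img_as_float, io
from skimage.transform import rotate
from skimage.util import random_noise

ref = img_as_float(io.imread("lena_256.png"))  # hypothetical 256 x 256 RGB reference in [0, 1]

distorted = {
    "salt_pepper": random_noise(ref, mode="s&p", amount=0.05),    # category I
    "speckle":     random_noise(ref, mode="speckle", var=0.01),   # category I
    "gaussian":    random_noise(ref, mode="gaussian", var=0.01),  # category I
    "gauss_blur":  gaussian_filter(ref, sigma=(2, 2, 0)),         # category III (blur spatially only)
    "rotation":    rotate(ref, angle=5, mode="edge"),             # category II
}

# Score one category-I test image with the proposed measure.
y = distorted["salt_pepper"]
q_i = quality_q(psnr_e(ref, y), psnr_ed(ref, y), psnr_s(ref, y), category="I")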
Figure 2. Reference image simulated with different types of distortion: (a)-(c) salt & pepper noise; (d)-(f) speckle noise; (g)-(j) Gaussian noise; (k)-(m) contrast stretching; (n), (o) rotation; (p)-(r) Gaussian blurring; (s)-(v) motion blurring; (w), (x) compression.

B. Subjective Evaluation

In this paper the efficiency of the objective quality measure is adjudged against a subjective evaluation model. In the subjective evaluation, 42 images were assessed for visual quality by a group of 150 observers. This set of images was distorted by varying levels of the specified distortions. The quality scores given by the observers were recorded on a quality scale of 0 to 1 (where 0 means worst and 1 means best). These subjective scores are directly related to the quality of the image, i.e. the higher the score, the better the visual quality of the image.

C. Processing of Subjective Scores

Let s_{ij} denote the subjective score assigned by observer i to image j. The subjective evaluation in this paper is based on the DMOS (difference mean opinion score), which takes into account the difference between the scores of the reference and the test image. The difference score d_{ij} is calculated by subtracting the subjective score assigned by the observer to the test image from the score assigned to the reference image:

d_{ij} = s_{ij}^{ref} - s_{ij}    (10)

The difference scores that are zero in the case of the reference image are removed from further processing. The remaining difference scores are converted to Z-scores [18] using the following equations:

\mu_i = \frac{1}{N} \sum_{j=1}^{N} d_{ij}    (11)

\sigma_i = \sqrt{ \frac{1}{N-1} \sum_{j=1}^{N} (d_{ij} - \mu_i)^2 }    (12)

z_{ij} = \frac{d_{ij} - \mu_i}{\sigma_i}    (13)

where N is the number of test images seen by the observer. These Z-scores are then linearly scaled to [0, 1] in order to develop a proper evaluation model. In our experiment all the Z-scores lie within the range [-4, 5]; hence, rescaling is accomplished by linearly mapping the range [-4, 5] to [0, 1] using:

z'_{ij} = \frac{z_{ij} + 4}{9}    (14)

Finally, the Difference Mean Opinion Score (DMOS) of each image is calculated as the average of the rescaled Z-scores for that image:

DMOS_j = \frac{1}{M} \sum_{i=1}^{M} z'_{ij}    (15)

where M is the total number of observers in the subjective evaluation [19].
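A compact NumPy sketch of this Z-score/DMOS pipeline (Eqs. (10)-(15)) is given below; it assumes the raw scores are stored as an observers-by-images matrix with one column holding the reference-image scores, which is our layout choice rather than the paper's.

import numpy as np

def dmos(raw_scores, ref_col):
    # raw_scores: (M observers) x (J images) array of subjective scores in [0, 1];
    # column ref_col holds each observer's score for the reference image.
    raw = np.asarray(raw_scores, dtype=float)
    d = raw[:, [ref_col]] - raw                   # Eq. (10): difference scores
    d = np.delete(d, ref_col, axis=1)             # drop the all-zero reference column
    mu = d.mean(axis=1, keepdims=True)            # Eq. (11): per-observer mean
    sigma = d.std(axis=1, ddof=1, keepdims=True)  # Eq. (12): per-observer std (N - 1)
    z = (d - mu) / sigma                          # Eq. (13): Z-scores
    z = (z + 4.0) / 9.0                           # Eq. (14): map [-4, 5] onto [0, 1]
    return z.mean(axis=0)                         # Eq. (15): DMOS per test image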
D. Comparison of Results

In order to evaluate the performance of the proposed quality measure (Q), the Pearson correlation coefficient is used as a tool to estimate the correlation of Q with respect to different HVS-based metrics such as the Edge Performance Index (EPI), Structural Correlation (SC), Mean Absolute Error (MAE) [20], CONTRAST (c(x,y)) [21], and the DMOS.
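The correlation analysis amounts to computing Pearson's r between the series of Q values and each comparison series over the same set of distorted images; a minimal NumPy version is sketched below for completeness (the function name is ours).

import numpy as np

def pearson_r(q_values, metric_values):
    # Pearson correlation between the proposed measure Q and another series
    # (e.g. DMOS, EPI, SC, MAE or CONTRAST) over the same set of test images.
    q = np.asarray(q_values, dtype=float)
    m = np.asarray(metric_values, dtype=float)
    return np.corrcoef(q, m)[0, 1]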
The results in Table I show that there is a large difference in the values of conventional PSNR as compared to Q for images with approximately the same visual quality. It can be inferred from the results in Table II that Q is more efficient in measuring the degradation for all the distortion categories as per the HVS characteristics. From the results in Table III, it can be seen that Q, in addition to being a better HVS-based quality measure, also has the capability to assess specific types of distortion such as edge, structure, contrast changes and error in the image, as the results have very high correlation coefficients. All the stated results validate the effectiveness of the proposed quality measure in quantifying the degradation of the image for the various distortion types. The effectiveness of the quality measure Q can also be seen from the scatter plots shown in Figures 3, 4 and 5. In Figure 3(a), the variation of the values of Q with the DMOS for noise contamination (category I) shows that for higher values of Q the DMOS score is low, i.e. the DMOS values decrease as the visual quality of the image increases because of lower noise contamination. The visual quality of the image degrades with increasing degrees of distortions such as compression, blurring and contrast change, which is also justified by the variation of Q shown in Figure 3(b) and (c). For the same type and degree of distortion, the values of PSNR and SSIM have also been plotted in Figures 4 and 5, but their values do not follow any regular pattern of variation with the subjective DMOS score, showing their inefficiency in evaluating the results as per the HVS characteristics.

Figure 3. Scatter plot of SSIM Vs DMOS for (a) Category I. (b) Category II. (c) Category III.

TABLE II. Pearson correlation between different quality measures and subjective evaluation.

CONCLUSION

This paper presents a new perceptual quality estimation model for color images based on the HVS. The proposed model accounts for all the types of degradation that severely affect the visual quality of the image. It evaluates the quality measure Q as a weighted sum of PSNR components due to different distortions for each color plane. Different sets of weights are used according to the category of the distortion, and their values are based on the effect of the distortion on the visual quality of the image, i.e. how it affects common properties of the HVS such as the error, structural and edge information of the image. The weighted coefficients used in the quality measure also strengthen its quality measurement ability. The correlation coefficients calculated for the subjective evaluation results prove the efficiency of the proposed quality measure and its consistency with the HVS characteristics. The simulation results also show strong correlation with several other well-known distortion measures such as EPI, SC, MAE and CONTRAST.

REFERENCES

[1] R. C. Gonzalez and R. E. Woods, "Digital Image Processing," 2nd ed., Prentice Hall, Upper Saddle River, New Jersey, 2001.
[2] N. Ponomarenko, V. Lukin, K. Egiazarian, J. Astola, M. Carli, and F. Battisti, "Color Image Database for Evaluation of Image Quality Metrics," in Proc. IEEE Int. Workshop on Multimedia Signal Process., Australia, pp. 403-408, Sept. 2008.
[3] Z. Wang, A. C. Bovik, and L. Lu, "Why is image quality assessment so difficult," in Proc. IEEE Int. Conf. Acoust., Speech and Signal Process., vol. 4, Orlando, pp. 3313-3316, May 2002.
[4] Z. Wang and A. C. Bovik, "Mean Squared Error: Love It or Leave It?," IEEE Signal Process. Magazine, pp. 98-117, Jan. 2009.
[5] Z. Wang, L. Lu, and A. C. Bovik, "Video quality assessment based on structural distortion measurement," Signal Process. Image Commun., vol. 19, no. 2, pp. 121-132, Feb. 2004.
[6] Z. Wang, A. C. Bovik, H. R. Sheikh and E. P. Simoncelli, "Image Quality Assessment: From Error Visibility to Structural Similarity," IEEE Trans. Image Process., vol. 13, no. 4, pp. 600-612, April 2004.
[7] E. Wharton, K. Panetta and S. Agaian, "Human Visual System Based Similarity Metrics," in Proc. IEEE Int. Conf. on Systems, Man and Cybernetics, pp. 685-690, Oct. 2008.
[8] W. Yang, L. Wu, Y. Fan and Z. Wang, "A Method of Image Quality Assessment Based on Region of Interest," in Proc. of the 7th World Congress on Intelligent Control and Automation, Chongqing, China, pp. 6840-6843, June 2008.
[9] W. Fu, X. Gu and Y. Wang, "Image Quality Assessment Using Edge and Contrast Similarity," in Proc. Int. Joint Conf. on Neural Networks, pp. 852-855, 2008.
[10] B. Wang, Z. Wang, Y. Liao and X. Lin, "HVS-based structural similarity for image quality assessment," in Proc. 9th Int. Conf. on Signal Process., Beijing, pp. 1194-1197, Dec. 2008.
[11] H.-S. Han, D.-O. Kim and R.-H. Park, "Gradient Information-Based Image Quality Metric," in Int. Conf. on Consumer Electronics (ICCE) Digest of Technical Papers, pp. 361-362, Feb. 2010.
[12] Y. Chen, M. Wu and Y. Fei, "Evaluation Method of Color Image Coding Quality Integrating Visual Characteristics of Human Eye," in Proc. 2nd Int. Conf. on Education Technology and Computer, Shanghai, China, vol. 2, pp. 562-566, June 2010.
[13] VQEG, "Final report from the video quality experts group on the validation of objective models of video quality assessment," http://www.vqeg.org/, Mar. 2000.
[14] J.-B. Martens and L. Meesters, "Image dissimilarity," Signal Processing, vol. 70, pp. 155-176, Nov. 1998.
[15] A. M. Eskicioglu and P. S. Fisher, "Image quality measures and their performance," IEEE Trans. Commun., vol. 43, pp. 2959-2965, Dec. 1995.
[16] T. N. Pappas and R. J. Safranek, "Perceptual criteria for image quality evaluation," in Handbook of Image and Video Processing (A. Bovik, ed.), Academic Press, 2000.
[17] J. Canny, "A computational approach to edge detection," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 8, no. 6, pp. 679-698, 1986.
[18] A. M. van Dijk, J. B. Martens, and A. B. Watson, "Quality assessment of coded images using numerical category scaling," in Proc. SPIE Advanced Image and Video Commun. and Storage Technologies, 1995.
[19] K. Seshadrinathan, R. Soundararajan, A. C. Bovik and L. K. Cormack, "Study of Subjective and Objective Quality Assessment of Video," IEEE Trans. Image Process., vol. 19, no. 6, pp. 1427-1441, June 2010.
[20] N. Tripathi, M. F. Wahidi, A. Gupta and V. Bhateja, "A Novel Spatial Domain Image Quality Metric," in Proc. (IEEE) 2011 World Congress on Engineering & Technology (CET-2011), Shanghai: Signal & Information Process., pp. 902-905, Oct. 2011.
[21] W. M. Morrow, R. B. Paranjape, R. M. Rangayyan and J. E. L. Desautels, "Region Based Contrast Enhancement of Mammograms," IEEE Trans. on Medical Imaging, vol. 11, no. 3, pp. 392-406, 1992.

TABLE III. Pearson correlation between different quality measures and EPI, SC, MAE and CONTRAST (c(x,y)).
