Visual Image Quality Assessment Technique using FSIM



International Journal of Computer Applications Technology and Research
Volume 2, Issue 3, 250-254, 2013
www.ijcat.com

Visual Image Quality Assessment Technique using FSIM

Rohit Kumar, SSCET Bhilai, CSVTU, India
Vishal Moyal, SSCET Bhilai, CSVTU, India

Abstract: The goal of quality assessment (QA) research is to design algorithms that can automatically assess the quality of images in a perceptually consistent manner. Image QA algorithms generally interpret image quality as fidelity or similarity with a “reference” or “perfect” image in some perceptual space. In order to improve the assessment accuracy for white noise, Gaussian blur, JPEG2000 compression and other distortions, this paper puts forward an image quality assessment method based on phase congruency and gradient magnitude. The experimental results show that this method has higher accuracy than traditional methods and that it accurately reflects the visual perception of the human eye. We also propose an image information measure that quantifies the information present in the reference image and how much of that reference information can be extracted from the distorted image.

Keywords: image quality assessment (IQA), structural similarity index (SSIM), feature similarity index (FSIM), phase congruency (PC), gradient magnitude (GM), low-level features

1. INTRODUCTION

Image quality assessment is an important research topic in the image processing area. Image quality is a fundamental characteristic of any image: it measures the perceived degradation of an image, generally in comparison with an ideal or perfect image. Digital images are subject to a wide variety of distortions during acquisition, processing, compression, storage, transmission and reproduction, any of which may result in a degradation of visual quality.
Imaging systems introduce some amount of distortion or artifacts that reduce the perceived quality, and this is our point of interest. By defining image quality in terms of deviation from the ideal situation, quality measures become technical in the sense that they can be determined impartially, as deviations from ideal models.

Generally speaking, visual quality assessment can be divided into two categories: subjective and objective. Subjective quality assessment is done by humans and represents realistic opinions about an image. Objective quality assessment uses a mathematical model to quantify the assessment index and simulates the human visual perception system to assess image quality. Common objective assessment indexes include PSNR (Peak Signal-to-Noise Ratio), SSIM (Structural Similarity) and FSIM (Feature Similarity). Based on the availability of a reference, objective quality assessment is classified into no-reference (NR), reduced-reference (RR) and full-reference (FR) methods [1]. If there is no reference signal available for the distorted (test) signal to compare with, the quality evaluation method is termed no-reference (NR). If the information about the reference medium is partially available, e.g., in the form of a set of extracted features, the method is called reduced-reference (RR). An FR method needs the complete reference medium to assess the distorted medium. Since it has full information about the original medium,
it is expected to have the best quality prediction performance, and most existing quality assessment schemes belong to this category.

Commonly implemented image quality assessment algorithms include mean squared error (MSE), peak signal-to-noise ratio (PSNR), the structural similarity index (SSIM) and the feature similarity index (FSIM). Mean squared error is the average squared difference between a reference image and a distorted image: it is computed pixel by pixel by adding up the squared differences of all the pixels and dividing by the total pixel count. Peak signal-to-noise ratio is the ratio between the reference signal and the distortion signal in an image, given in decibels; the higher the PSNR, the closer the distorted image is to the original. MSE and PSNR are the simplest and most widely used image quality metrics, since they are easy to calculate and mathematically convenient in the context of optimization. However, they often correlate poorly with subjective visual quality, and PSNR has long been criticized for its poor correlation with human subjective evaluations. According to our observations, however, PSNR can still work very well on some specific distortion types, such as additive and quantization noise. The structural similarity (SSIM) index is a method for measuring the similarity between two images; it attempts to measure the change in luminance, contrast and structure in an image [2-4]. The multi-scale extension of SSIM, called MS-SSIM [5], produces better results than its single-scale counterpart. Recent studies conducted in [6] and [7] have demonstrated that SSIM, MS-SSIM and VIF can offer statistically much better performance in predicting images’ fidelity than the other IQA metrics.
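The MSE and PSNR definitions above translate directly into code. The following is a minimal sketch, not the paper's implementation, assuming 8-bit images so that the peak value is 255:

```python
import numpy as np

def mse(ref, dist):
    # Average squared pixel difference between reference and distorted image.
    ref = np.asarray(ref, dtype=np.float64)
    dist = np.asarray(dist, dtype=np.float64)
    return float(np.mean((ref - dist) ** 2))

def psnr(ref, dist, peak=255.0):
    # PSNR in decibels; higher means the distorted image is closer to the reference.
    err = mse(ref, dist)
    if err == 0.0:
        return float("inf")  # identical images
    return float(10.0 * np.log10(peak ** 2 / err))
```

For a reference image shifted uniformly by 10 gray levels, for instance, mse gives 100 and psnr gives 10·log10(255²/100), about 28.1 dB.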
However, SSIM and MS-SSIM share a common deficiency: when pooling a single quality score from the local quality map (or the local distortion measurement map), all positions are considered to have the same importance. The feature similarity index performs IQA based on the fact that the human visual system (HVS) understands an image mainly according to its low-level features. The main feature of FSIM is phase congruency (PC), which is a dimensionless measure of local structure. PC has already been used for IQA in the literature. In [8], Liu and Laganière proposed a PC-based IQA metric: PC maps are partitioned into sub-blocks of size 5×5, cross-correlation is used to measure the similarity between two corresponding PC sub-blocks, and the overall similarity score is obtained by averaging the cross-correlation values over all block pairs. In [9], PC was extended to phase coherence, which can be used to characterize image blur. Based on [8], Hassen et al. proposed an NR IQA metric to assess the sharpness of an input image [10]. PC is contrast invariant, yet the contrast of an image does affect the HVS’ perception of its quality; this contrast information is captured by the secondary feature of FSIM, the gradient magnitude (GM). Phase congruency and gradient magnitude play complementary roles in characterizing local image quality and are combined to derive a single quality score.

2. DESIGN METHODOLOGY

Fig. 1: Block diagram of proposed method

3. FEATURE SIMILARITY INDEXING

In this work, a full-reference IQA metric is proposed based on the fact that the human visual system (HVS) understands an image mainly according to its low-level features. The primary feature is phase congruency (PC), which is a
dimensionless measure of the significance of a local structure; the image gradient magnitude (GM) is employed as the secondary feature. PC and GM play complementary roles in characterizing local image quality and are combined to derive a single quality score.

3.1 Phase Congruency (PC)

Phase congruency is a method for detecting features in images. One of its significant strengths is its invariance to lighting variation within an image, as well as its ability to detect a wide range of interesting features. We present a method for estimating the phase congruency of localized frequencies that cannot be measured separately by Gabor filters: by measuring the ratio of the standard deviation to the mean energy between different phase-shifted Gabor filters, we can estimate whether the localized frequencies are phase congruent. Phase congruency reflects the behaviour of the image in the frequency domain. PC is contrast invariant, while contrast information does affect the HVS’ perception of image quality. A corner and edge detector [12-14] developed from the phase congruency model of feature detection [11] is used.

Fig. 2: (a) Original image (b) localized-frequency-based phase congruency

In this paper we adopt the formulation for the 1D signal I(x). Let M_n^e and M_n^o denote the even-symmetric and odd-symmetric filters on scale n; they form a quadrature pair. The signal forms a response vector at position x on scale n:

    [e_n(x), o_n(x)] = [I(x) * M_n^e, I(x) * M_n^o],

and the local amplitude on scale n is

    A_n(x) = sqrt(e_n(x)^2 + o_n(x)^2).

Let V(x) = Σ_n e_n(x) and H(x) = Σ_n o_n(x).    ...(1)

Then phase congruency is given by

    PC(x) = E(x) / (ε + Σ_n A_n(x)),

where E(x) = sqrt(V(x)^2 + H(x)^2) and ε is a small positive constant. We adopt the
We adopt thelog-Gabor filters because:1) Log-Gabor filters can be constructed witharbitrary bandwidth and the bandwidth can beoptimised to produce a filter with minimalspatial extent.2) log-Gabor functions, by definition, alwayshave no DC component.3) The transfer function of the log-Gabor filterhas an extended tail at the high frequency end,which makes it more capable to encode naturalimages than ordinary.3.2 Gradient magnitude (GM)Image gradient computation is a traditional topicin image processing. Gradient operators can beexpressed by convolution masks. Threecommonly used gradient operators are the Sobeloperator, the Prewitt operator and the Scharr-operator. Their performances will be examinedin the section of experimental results. The partialderivatives Gx(y) and Gy(y) of the image f(y)along horizontal and vertical directions using thethree gradient operators are used. The gradientmagnitude (GM) of f(y) is then defined asට‫ܩ‬௫ଶ ൅ ‫ܩ‬௬ଶ3.3 FSIM algorithmWith the extracted PC and GM feature maps, inthis section we present a novel Feature Similarity(FSIM) index for IQA. Suppose that we aregoing to calculate the similarity between imagesf1 (test image) and f2 (original image) denote by
PC1(x) and PC2(x) the phase congruency maps extracted from f1(x) and f2(x), and by G1(x) and G2(x) the gradient magnitude maps extracted from them. It should be noted that for color images, the phase congruency and gradient features are extracted from the luminance channel. FSIM is defined and computed based on PC1(x), PC2(x), G1(x) and G2(x). Furthermore, by incorporating image chrominance information into FSIM, an IQA index for color images, denoted FSIMC, is obtained.

First, the similarity of PC1(x) and PC2(x) is measured as

    S_PC(x) = (2·PC1(x)·PC2(x) + T1) / (PC1(x)^2 + PC2(x)^2 + T1)    ...(2)

where T1 is a positive constant introduced to increase the stability of S_PC; in practice, T1 can be determined based on the dynamic range of PC values. Equation (2) is a commonly used measure of the similarity of two positive real numbers, and its result ranges within (0, 1].

Similarly, the GM values G1(x) and G2(x) are compared, and the similarity measure is defined as

    S_G(x) = (2·G1(x)·G2(x) + T2) / (G1(x)^2 + G2(x)^2 + T2)    ...(3)

where T2 is a positive constant depending on the dynamic range of GM values. In our experiments, both T1 and T2 are fixed across all databases so that the proposed FSIM can be used conveniently. Then, S_PC(x) and S_G(x) are combined into the similarity S_L(x) of f1(x) and f2(x):

    S_L(x) = [S_PC(x)]^α · [S_G(x)]^β    ...(4)

where α and β are parameters used to adjust the relative importance of the PC and GM features. In this paper we set α = β = 1 for simplicity, so S_L(x) = S_PC(x)·S_G(x).

4. EXPERIMENTAL RESULTS AND DISCUSSION

A. Databases and methods for comparison

There are several publicly available image databases in the IQA community, including TID2008, CSIQ, LIVE and A57. All of them are used here for algorithm validation and comparison.
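Given precomputed PC and GM maps, the per-pixel similarities of Eqs. (2)-(4) and the pooling into a single score can be sketched as below. This is a sketch, not the authors' code: the constants T1 = 0.85 and T2 = 160 are the values reported in the original FSIM paper, and the PC-weighted pooling (weighting S_L(x) by max(PC1(x), PC2(x))) is the standard FSIM choice; neither is spelled out in this excerpt.

```python
import numpy as np

def fsim_from_maps(pc1, pc2, g1, g2, T1=0.85, T2=160.0, alpha=1.0, beta=1.0):
    # Per-pixel similarity of the phase congruency maps, Eq. (2).
    s_pc = (2.0 * pc1 * pc2 + T1) / (pc1 ** 2 + pc2 ** 2 + T1)
    # Per-pixel similarity of the gradient magnitude maps, Eq. (3).
    s_g = (2.0 * g1 * g2 + T2) / (g1 ** 2 + g2 ** 2 + T2)
    # Combined local similarity, Eq. (4); alpha = beta = 1 as in the paper.
    s_l = (s_pc ** alpha) * (s_g ** beta)
    # Pool to one score, weighting each position by its maximum PC value so
    # that perceptually significant regions count more (standard FSIM pooling).
    pc_m = np.maximum(pc1, pc2)
    return float(np.sum(s_l * pc_m) / np.sum(pc_m))
```

For identical feature maps every local similarity is 1, so the pooled score is exactly 1; any discrepancy between the maps pushes the score below 1.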
The performance of the proposed FSIM and FSIMC indices is evaluated and compared with four representative IQA metrics, using four commonly used performance metrics.

B. Determination of parameters

Several parameters need to be determined for FSIM and FSIMC. To this end, we tuned the parameters on a subset of the TID2008 database containing its first 8 reference images and the associated 544 distorted images.

C. Gradient operator selection

Table 1: SROCC values using three gradient operators

    Operator    SROCC
    Sobel       0.8797
    Prewitt     0.8776
    Scharr      0.8825

In the proposed FSIM/FSIMC metrics, the gradient magnitude (GM) needs to be calculated. To this end, three commonly used gradient operators were examined and the one providing the best result was selected. The selection was carried out with all the parameters discussed earlier held fixed, and the criterion was that the gradient operator leading to the highest SROCC would be selected.

5. CONCLUSION

In this paper we proposed FSIM, a novel, efficient and effective IQA index based on a specific visual saliency model. FSIM is designed on the assumption that an image's visual saliency map has a close relationship with its perceptual quality. Experimental results indicate that FSIM yields statistically better prediction performance than all the other competing methods evaluated. Thus, FSIM can be a leading candidate among IQA indices for real-time applications.

6. REFERENCES

[1] Z. Wang, A.C. Bovik, H.R. Sheikh, and E.P. Simoncelli, “Image quality assessment: from error visibility to structural similarity”, IEEE Trans. Image Process., vol. 13, no. 4, pp. 600-612, Apr. 2004.
[2] N. Damera-Venkata, T.D. Kite, W.S. Geisler, B.L. Evans, and A.C. Bovik, “Image quality assessment based on a degradation model”, IEEE Trans. Image Process., vol. 9, no. 4, pp. 636-650, Apr. 2000.

[3] D.M. Chandler and S.S. Hemami, “VSNR: a wavelet-based visual signal-to-noise ratio for natural images”, IEEE Trans. Image Process., vol. 16, no. 9, pp. 2284-2298, Sep. 2007.

[4] H.R. Sheikh and A.C. Bovik, “Image information and visual quality”, IEEE Trans. Image Process., vol. 15, no. 2, pp. 430-444, Feb. 2006.

[5] Z. Wang, E.P. Simoncelli, and A.C. Bovik, “Multi-scale structural similarity for image quality assessment”, presented at the IEEE Asilomar Conf. Signals, Systems and Computers, Nov. 2003.

[6] H.R. Sheikh, M.F. Sabir, and A.C. Bovik, “A statistical evaluation of recent full reference image quality assessment algorithms”, IEEE Trans. Image Process., vol. 15, no. 11, pp. 3440-3451, Nov. 2006.

[7] N. Ponomarenko, V. Lukin, A. Zelensky, K. Egiazarian, M. Carli, and F. Battisti, “TID2008 - a database for evaluation of full-reference visual quality assessment metrics”, Advances of Modern Radioelectronics, vol. 10, pp. 30-45, 2009.

[8] Z. Liu and R. Laganière, “Phase congruence measurement for image similarity assessment”, Pattern Recognit. Letters, vol. 28, no. 1, pp. 166-172, Jan. 2007.

[9] Z. Wang and E.P. Simoncelli, “Local phase coherence and the perception of blur”, in Adv. Neural Information Processing Systems, 2004, pp. 786-792.

[10] R. Hassen, Z. Wang, and M. Salama, “No reference image sharpness assessment based on local phase coherence measurement”, in Proc. IEEE Int. Conf. Acoust., Speech, and Signal Processing, 2010, pp. 2434-2437.

[11] P. Kovesi, “Image features from phase congruency”, Videre: J. Comp. Vis. Res., vol. 1, no. 3, pp. 1-26, 1999.

[12] D. Marr and E. Hildreth, “Theory of edge detection”, Proc. R. Soc. Lond. B, vol. 207, no. 1167, pp. 187-217, Feb. 1980.

[13] M.C. Morrone and D.C. Burr, “Feature detection in human vision: a phase-dependent energy model”, Proc. R. Soc. Lond. B, vol. 235, no. 1280, pp. 221-245, Dec. 1988.

[14] M.C. Morrone and R.A. Owens, “Feature detection from local energy”, Pattern Recognit. Letters, vol. 6, no. 5, pp. 303-313, Dec. 1987.
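As a supplement to the gradient operator selection in Section 4C: the SROCC used to rank the operators in Table 1 is the Spearman rank-order correlation, i.e. the Pearson correlation of the rank-transformed objective and subjective scores. A minimal sketch, with no tie handling (which a real benchmark with repeated scores would need):

```python
import numpy as np

def srocc(objective_scores, subjective_scores):
    # Spearman rank-order correlation: Pearson correlation of the rank vectors.
    def to_ranks(v):
        order = np.argsort(v)
        ranks = np.empty(len(v), dtype=np.float64)
        ranks[order] = np.arange(1, len(v) + 1)  # rank 1 = smallest value
        return ranks

    rx = to_ranks(np.asarray(objective_scores, dtype=np.float64))
    ry = to_ranks(np.asarray(subjective_scores, dtype=np.float64))
    rx -= rx.mean()
    ry -= ry.mean()
    return float(np.sum(rx * ry) / np.sqrt(np.sum(rx ** 2) * np.sum(ry ** 2)))
```

Because only ranks matter, any monotonically increasing relationship between metric scores and subjective scores yields an SROCC of 1, which is why SROCC is preferred over plain linear correlation for prediction monotonicity.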
