Detection of breast cancer in its early stage is very important in the field of medicine. Optimal contrast enhancement is essential for the detection of masses and microcalcifications in mammogram images. Standard histogram equalization is a simple and effective method for contrast enhancement, but for medical images it often produces excessive enhancement because it offers no control over the level of enhancement. In this paper, image enhancement is posed as an optimization problem, and an optimization technique based on the entropy and edge information of the image is presented. The enhancement function used in the paper is Contrast Limited Adaptive Histogram Equalization (CLAHE) based on local contrast modification (LCM). Its enhancement potential is tested with the Sobel operator for the detection of microcalcifications. Results are compared with other enhancement techniques such as histogram equalization, unsharp masking, and plain CLAHE.
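To make the CLAHE idea referenced above concrete, the sketch below implements only the contrast-limiting step on a whole image: the histogram is clipped at a fraction of the pixel count and the excess redistributed before the equalization mapping is built. This is a minimal illustration, not the paper's LCM-based method; real CLAHE additionally operates on local tiles and bilinearly interpolates the per-tile mappings, and the `clip_limit` value here is an arbitrary assumption.

```python
import numpy as np

def clipped_hist_equalize(img, clip_limit=0.02, n_bins=256):
    """Contrast-limited histogram equalization (single-tile sketch).

    The histogram is clipped at `clip_limit` * pixel count and the
    clipped excess is redistributed uniformly across all bins, which
    caps how steep the equalization mapping can become.
    """
    img = np.asarray(img, dtype=np.uint8)
    hist = np.bincount(img.ravel(), minlength=n_bins).astype(float)

    # Clip the histogram and redistribute the clipped excess uniformly.
    limit = clip_limit * img.size
    excess = np.maximum(hist - limit, 0).sum()
    hist = np.minimum(hist, limit) + excess / n_bins

    # Build the equalization mapping from the cumulative distribution.
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())  # normalize to [0, 1]
    lut = np.round(cdf * (n_bins - 1)).astype(np.uint8)
    return lut[img]
```

With a higher `clip_limit` the mapping approaches plain histogram equalization; with a very low one the output stays close to the input, which is exactly the control knob standard histogram equalization lacks.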
Feature Extraction of an Image by Using Adaptive Filtering and Morphological S... — IOSR Journals
Abstract: Various enhancement schemes are used for enhancing an image, including gray-scale manipulation, filtering, and histogram equalization. Histogram equalization is one of the best-known image enhancement techniques; it became popular for contrast enhancement because it is simple and effective. The basic idea of the histogram equalization method is to remap the gray levels of an image. Here, morphological segmentation is used to obtain the segmented image, with morphological reconstruction performing the segmentation. A comparative analysis of the different enhancement and segmentation methods is carried out on the basis of subjective and objective parameters. The subjective parameter is visual quality; the objective parameters are area, perimeter, minimum and maximum intensity, average voxel intensity, standard deviation of intensity, eccentricity, coefficient of skewness, coefficient of kurtosis, median intensity, and mode intensity. Keywords: Histogram Equalization, Segmentation, Morphological Reconstruction.
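The gray-level remapping that this abstract describes can be sketched in a few lines: each gray level is sent through the normalized cumulative histogram, the standard equalization transfer function. This is a generic textbook implementation, not code from the paper.

```python
import numpy as np

def histogram_equalize(img, n_bins=256):
    """Classical histogram equalization: remap gray levels so the
    cumulative distribution of the output is approximately uniform."""
    img = np.asarray(img, dtype=np.uint8)
    hist = np.bincount(img.ravel(), minlength=n_bins)
    cdf = hist.cumsum()

    # Standard transfer function: stretch the occupied part of the CDF
    # over the full [0, n_bins - 1] output range.
    cdf_min = cdf[cdf > 0].min()
    lut = np.round((cdf - cdf_min) / (img.size - cdf_min) * (n_bins - 1))
    lut = np.clip(lut, 0, n_bins - 1).astype(np.uint8)
    return lut[img]
```

Applied to a low-contrast image whose gray levels occupy a narrow band, the mapping stretches that band across the full dynamic range, which is the effect the abstracts in this listing repeatedly build on.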
PURPOSE: This study aims to perform microcalcification detection by enhancing mammography images using the negative image transformation and histogram equalization. METHOD: Mammography images in .pgm format are converted to .jpg format, transformed into negative images, and then processed with histogram equalization. RESULT: The results of the enhancement process using the negative image technique and histogram equalization are compared and validated with MSE and PSNR for each mammographic image. CONCLUSION: Image enhancement of mammography images is feasible; however, only some images show improved quality. This is affected by the threshold used, which plays an important role in obtaining better visualization of the mammographic image.
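The pipeline in that study is simple enough to sketch: the negative transform for 8-bit images plus the MSE and PSNR measures used for validation. These are the standard textbook definitions, assumed here to match what the paper computes.

```python
import numpy as np

def negative(img):
    """Negative transform for 8-bit images: s = 255 - r."""
    return 255 - np.asarray(img, dtype=np.uint8)

def mse(a, b):
    """Mean squared error between two images of equal shape."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.mean((a - b) ** 2))

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    m = mse(a, b)
    return float('inf') if m == 0 else 10 * np.log10(peak ** 2 / m)
```

The negative transform is an involution (applying it twice recovers the original), and a higher PSNR between the processed image and a reference indicates less distortion, which is how the study compares its two enhancement routes.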
Modified Contrast Enhancement using Laplacian and Gaussians Fusion Technique — iosrjce
The aim of image fusion is to combine images of a scene captured under different illumination, so that one image automatically contains most of the information from all the source images. Contrast enhancement is employed to improve the quality of the visible image without introducing unrealistic visual appearances. Fusion techniques are employed in important applications such as medical imaging, microscopic imaging, remote sensing, computer vision, and robotics. Contrast enhancement improves the brightness differences in the dark, gray, or bright regions at the expense of the brightness differences in the other regions. This paper presents a contrast enhancement methodology that improves the local image contrast by controlling the local image gradient. The proposed methodology addresses the enhancement problem and maximizes both the local and global contrast of an image.
CONTRAST ENHANCEMENT AND BRIGHTNESS PRESERVATION USING MULTIDECOMPOSITION HIS... — sipij
Histogram Equalization (HE) has been an essential addition to the image enhancement field. Enhancement techniques such as Classical Histogram Equalization (CHE), Adaptive Histogram Equalization (AHE), Bi-Histogram Equalization (BHE), and Recursive Mean Separate Histogram Equalization (RMSHE) enhance contrast, but brightness is not well preserved, which gives an unpleasant look to the final image. We therefore introduce a novel technique, Multi-Decomposition Histogram Equalization (MDHE), to eliminate the drawbacks of the earlier methods. In MDHE, we decompose the input image using a unique logic, apply CHE to each of the sub-images, and finally interpolate them in the correct order. The final image after MDHE gives the best results in terms of contrast enhancement and brightness preservation compared to all the other techniques mentioned above. We calculate parameters such as PSNR, SNR, RMSE, and MSE for every technique; our results are supported by bar graphs, histograms, and the parameter calculations.
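The decompose / equalize-per-sub-image / recombine pattern that MDHE describes can be illustrated generically. The paper's "unique logic" for decomposition is not public, so the sketch below simply splits the image into four quadrants and equalizes each independently; it shows the pattern, not the MDHE algorithm itself.

```python
import numpy as np

def equalize_quadrants(img, n_bins=256):
    """Equalize each quadrant of a 2-D image independently, then
    reassemble. A generic stand-in for MDHE's decomposition step."""
    def he(sub):
        # Plain histogram equalization of one sub-image.
        hist = np.bincount(sub.ravel(), minlength=n_bins)
        cdf = hist.cumsum().astype(float)
        span = max(cdf.max() - cdf.min(), 1.0)  # guard constant sub-images
        lut = np.round((cdf - cdf.min()) / span * (n_bins - 1)).astype(np.uint8)
        return lut[sub]

    img = np.asarray(img, dtype=np.uint8)
    h, w = img.shape
    out = img.copy()
    for rs in (slice(0, h // 2), slice(h // 2, h)):
        for cs in (slice(0, w // 2), slice(w // 2, w)):
            out[rs, cs] = he(img[rs, cs])
    return out
```

Equalizing sub-images separately keeps each mapping local to its region's statistics; MDHE's interpolation step (not shown) then smooths the seams between sub-images.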
A MODIFIED HISTOGRAM BASED FAST ENHANCEMENT ALGORITHM — csandit
The contrast enhancement of medical images plays an important role in disease diagnosis, especially in cancer cases. Histogram equalization is considered the most popular algorithm for contrast enhancement owing to its effectiveness and simplicity. In this paper, we present a modified version of the Histogram Based Fast Enhancement Algorithm. The original algorithm enhances the areas of interest with low complexity, but it applies only to CT head images; its idea is based on treating the soft tissues and ignoring the other details in the image. The proposed modification makes the algorithm valid for most CT image types, with enhanced results.
Review on Image Enhancement in Spatial Domainidescitation
With the proliferation in electronic imaging devices
like in mobiles, computer vision, medical field and space field;
image enhancement field has become the quite interesting
and important area of research. These imaging devices are
viewed under a diverse range of viewing conditions and a huge
loss in contrast under bright outdoor viewing conditions; thus
viewing condition parameters such as surround effects,
correlated color temperature and ambient lighting have
become of significant importance. Therefore, Principle
objective of Image enhancement is to adjust the quality of an
image for better human visual perception. Appropriate choice
of enhancement techniques is greatly influenced by the
imaging modality, task at hand and viewing conditions.
Basically, image enhancement techniques have been classified
into two broad categories: Spatial domain image enhancement
and Frequency domain image enhancement. This survey report
gives an overview of different methodologies have been used
for enhancement under the spatial domain category. It is noted
that in this field still more research is to be done.
A study of a modified histogram based fast enhancement algorithm (MHBFE) — sipij
Image enhancement is one of the most important issues in low-level image processing. The goal of image enhancement is to improve the quality of an image such that the enhanced image is better than the original. Conventional histogram equalization (HE) is one of the algorithms most widely used for contrast enhancement of medical images, owing to its simplicity and effectiveness. However, it causes an unnatural look and visual artefacts, as it tends to change the brightness of an image. The Histogram Based Fast Enhancement Algorithm (HBFE) enhances CT head images, reducing the water-washed effect caused by conventional histogram equalization algorithms with less complexity. It relies on using the full range of gray levels to enhance the soft tissues while ignoring other image details. We present a modification of this algorithm that makes it valid for most CT image types while keeping the same degree of simplicity. Experimental results show that the Modified Histogram Based Fast Enhancement Algorithm (MHBFE) improves the results in terms of PSNR, AMBE, and entropy. We also use statistical analysis to confirm that the improvement of the proposed modification can be generalized: analysis of variance (ANOVA) is first used to test whether all the results have the same average, and we then establish the significant improvement of the modification.
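Two of the evaluation metrics recurring in these abstracts, AMBE and discrete entropy, have simple standard definitions, sketched below under the usual textbook formulations (the papers are assumed to use the same ones).

```python
import numpy as np

def ambe(original, enhanced):
    """Absolute Mean Brightness Error: |mean(input) - mean(output)|.
    Lower values indicate better brightness preservation."""
    return abs(float(np.mean(original)) - float(np.mean(enhanced)))

def discrete_entropy(img, n_bins=256):
    """Shannon entropy of the gray-level distribution, in bits.
    Higher entropy is usually read as richer detail after enhancement."""
    p = np.bincount(np.asarray(img, dtype=np.uint8).ravel(),
                    minlength=n_bins).astype(float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log2(p)).sum())
```

A constant image has zero entropy, a uniformly distributed 8-bit image has the maximum of 8 bits, and an enhancement method that leaves the mean gray level unchanged scores an AMBE of zero.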
Optimal Coefficient Selection For Medical Image Fusion — IJERA Editor
Medical image fusion is one of the major research fields in image processing. Medical imaging has become a vital component of major clinical applications such as detection, diagnosis, and treatment. Joint analysis of medical data collected from the same patient using different modalities is required in many clinical applications. This paper introduces an optimal fusion technique for multiscale-decomposition-based fusion of medical images and measures its performance against existing fusion techniques. The approach incorporates a genetic algorithm for optimal coefficient selection and employs various multiscale filters for noise removal. Experiments demonstrate that the proposed fusion technique generates better results than the existing rules, and its performance is found to be superior to the existing schemes in the literature.
Enhancement of Medical Images using Histogram Based Hybrid Technique — INFOGAIN PUBLICATION
Digital image processing is a very important area of research. A number of techniques are available for the enhancement of gray-scale as well as color images, and they work very efficiently. Techniques such as Histogram Equalization, BBHE, RSWHE, RSWHE (recursion=2, gamma=No), and AGCWD (recursion=0, gamma=0) have been used quite frequently for image enhancement, but they have shortcomings: the major one is that the brightness of the image deteriorates considerably during enhancement. A technique was therefore needed that enhances the image without reducing its brightness. To remove this shortcoming, a new hybrid technique, RSWHE+AGCWD (recursion=2, gamma=0 or 1), is proposed. Its results were compared with the existing techniques; with the proposed method, the brightness did not decrease during enhancement, so the technique was validated and accepted. Parameters such as PSNR, MSE, and AMBE are used for performance evaluation and validation of the proposed technique against the existing ones.
Design and implementation of fuzzy logic based image fus... — IJAEMS, Sept. 2015, INFOGAIN PUBLICATION
The quality of an image is important for both humans and machines. To meet the requirement for good-quality images, image enhancement is needed. Applying a single contrast enhancement technique often does not produce a desirable result and may lead to over-enhanced images. To overcome this problem, image fusion is performed so that better results with the desired enhancement can be achieved. In this paper, an amalgamation of image enhancement, fusion, and sharpening is carried out in the candidate algorithm, which uses fuzzy logic for weight calculation. The results are compared with the DACE/LIF approach, and it is observed that the proposed algorithm improves on the existing technique in terms of quality parameters: PSNR (Peak Signal-to-Noise Ratio) by 0.5 dB, AMBE (Absolute Mean Brightness Error) by 3, and SSIM (Structural Similarity Index) by 0.1.
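The fusion step that paper builds on can be sketched independently of the fuzzy-logic weighting. In the sketch below, the per-pixel weight map is taken as a given input (the fuzzy inference that would produce it is not reproduced), so only the weighted pixel-wise combination itself is illustrated.

```python
import numpy as np

def fuse(img_a, img_b, weights):
    """Pixel-wise weighted fusion of two enhanced versions of a scene.

    `weights` holds per-pixel values in [0, 1]: 1 selects img_a,
    0 selects img_b, intermediate values blend the two.
    """
    a = np.asarray(img_a, dtype=float)
    b = np.asarray(img_b, dtype=float)
    w = np.asarray(weights, dtype=float)
    return np.clip(w * a + (1.0 - w) * b, 0, 255).astype(np.uint8)
```

In a fuzzy-logic scheme the weight map would come from membership functions over local image properties (brightness, contrast); any rule base that emits values in [0, 1] plugs into this fusion step unchanged.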
MODIFIED HISTOGRAM EQUALIZATION FOR IMAGE CONTRAST ENHANCEMENT USING PARTICLE... — ijcseit
A novel Modified Histogram Equalization (MHE) technique for contrast enhancement is proposed in this paper. The technique modifies the probability density function of an image by introducing constraints prior to the process of histogram equalization (HE). These constraints are formulated using two parameters that are optimized using swarm intelligence. This form of contrast enhancement takes control over the effect of HE so that it enhances the image without any loss of detail. A median adjustment factor is then added to the result to normalize the change in luminance level after enhancement. This factor suppresses the luminance change caused by outlier pixels, since outlier pixels with highly deviated intensities have a greater impact on the contrast of an image. The approach provides a convenient and effective way to control the enhancement process while being adaptive to various types of images. Experimental results show that the proposed technique gives better results in terms of discrete entropy and SSIM than the existing histogram-based equalization methods.
Contrast enhancement of color images using improved retinex method — eSAT Journals
Abstract: Color images provide more information for human visual perception than grayscale images. Color image enhancement methods enhance the visual data to increase the clarity of the color image and thereby improve human perception of the information. Different contrast enhancement methods are used to increase the contrast of color images. Retinex algorithms enhance color images to resemble the scene as perceived by the human eye. Multiscale retinex with color restoration (MSRCR) is one type of retinex algorithm, but it produces graying-out and halo artifacts at the edges of images. The focus here is therefore on improving the MSRCR algorithm by combining it with contrast limited adaptive histogram equalization (CLAHE). Keywords: color image enhancement, retinex algorithms
IJERA (International Journal of Engineering Research and Applications) is an international online, ... peer-reviewed journal. For more details or to submit your article, please visit www.ijera.com
Image enhancement is one of the challenging issues in image processing. The objective of image enhancement is to process an image so that the result is more suitable than the original for a specific application. Digital image enhancement techniques provide many choices for improving the visual quality of images, and the appropriate choice among them is very important. This paper provides an overview and analysis of different techniques commonly used for image enhancement, which plays a fundamental role in vision applications. Much work has recently been done in this field, and many techniques have been proposed for enhancing digital images; this paper presents a survey of them.
International Journal of Computational Engineering Research (IJCER) is an international, online, monthly English-language journal. The journal publishes original research work that contributes significantly to scientific knowledge in engineering and technology.
Breast cancer is one of the most common diseases diagnosed among female cancer patients. Early detection is needed to reduce the risk of fatality, as no cure has yet been found for this illness. This research improves on the Gradient Vector Flow (GVF) Snake Active Contour technique for mammography segmentation. Segmentation of the mammogram image is performed to isolate lesions using the Chan-Vese Active Contour and the Localized Active Contour; the effectiveness of the two methods is then compared and the better one chosen. The Digital Database for Screening Mammography (DDSM) is used for the purpose of screening. First, the images undergo pre-processing with a Gaussian low-pass filter to remove unwanted noise; contrast enhancement is then applied. Segmentation is then conducted using the Chan-Vese and Localized Active Contour methods. The results show that the Chan-Vese technique outperforms the Localized Active Contour with 90% accuracy.
An image enhancement method based on gabor filtering in wavelet domain and ad... — nooriasukmaningtyas
Images are not always good enough to convey the proper information: an image may be very bright or very dark, or it may have low or high contrast. For these reasons, image enhancement plays an important role in digital image processing. In this paper we propose an image enhancement technique in which Gabor and median filtering are performed in the wavelet domain and adaptive histogram equalization is performed in the spatial domain. Brightness and contrast are the two parameters used for analyzing the performance of the proposed method.
EFFECTIVE PROCESSING AND ANALYSIS OF RADIOTHERAPY IMAGES — sipij
The a-Si Electronic Portal Imaging Device (EPID) is an important tool to verify the location of the radiation therapy beam with respect to the patient anatomy. But Electronic Portal Images (EPI) suffer from low contrast. In order to obtain better in-treatment images from which to extract relevant features of the anatomy, image processing tools need to be integrated into radiology systems. The goal of this research work is to inspect several image processing techniques for contrast enhancement of electronic portal images and to gauge parameters such as mean, variance, standard deviation, MSE, RMSE, entropy, PSNR, AMBE, normalised cross correlation, average difference, structural content (SC), maximum difference, and normalised absolute error (NAE) to study their visual quality improvement. In addition, by adding salt-and-pepper noise, Gaussian noise, and motion blur, we calculate error measurement parameters such as the Universal Image Quality (UIQ) index, Enhancement Measurement Error (EME), Pearson Correlation Coefficient, SNR, and Mean Absolute Error (MAE). The improved results point out that image processing tools need to be incorporated into radiology for accurate delivery of dose.
Optimal Coefficient Selection For Medical Image FusionIJERA Editor
Medical image fusion is one of the major research fields in image processing. Medical imaging has become a
vital component in major clinical applications such as detection/ diagnosis and treatment. Joint analysis of
medical data collected from same patient using different modalities is required in many clinical applications.
This paper introduces an optimal fusion technique for multiscale-decomposition based fusion of medical images
and measuring its performance with existing fusion techniques. This approach incorporates genetic algorithm
for optimal coefficient selection and employ various multiscale filters for noise removal. Experiments
demonstrate that proposed fusion technique generate better results than existing rules. The performance of
proposed system is found to be superior to existing schemes used in this literature.
Enhancement of Medical Images using Histogram Based Hybrid TechniqueINFOGAIN PUBLICATION
Digital Image Processing is very important area of research. A number of techniques are available for image enhancement of gray scale images as well as color images. They work very efficiently for enhancement of the gray scale as well as color images. Important techniques namely Histogram Equalization, BBHE, RSWHE, RSWHE (recursion=2, gamma=No), AGCWD (Recursion=0, gamma=0) have been used quite frequently for image enhancement. But there are some shortcomings of the present techniques. The major shortcoming is that while enhancement, the brightness of the image deteriorates quite a lot. So there was need for some technique for image enhancement so that while enhancement was done, the brightness of the images does not go down. To remove this shortcoming, a new hybrid technique namely RESWHE+AGCWD (recursion=2, gamma=0 or 1) was proposed. The results of the proposed technique were compared with the existing techniques. In the present methodology, the brightness did not decrease during image enhancement. So the results and the technique was validated and accepted. The parameters via PSNR, MSE, AMBE etc. are taken for performance evaluation and validation of the proposed technique against the existing techniques which results in better outperform.
7 ijaems sept-2015-8-design and implementation of fuzzy logic based image fus...INFOGAIN PUBLICATION
The quality of image holds importance for both humans and machines. To fulfill the requirement of good quality images, image enhancement is needed. Application of a single contrast enhancement technique often does not produce desirable result and may lead to over enhanced images. To overcome this problem image fusion is performed so that better results with desired enhancement can be achieved. In the present paper an amalgamation of image enhancement, fusion and sharpening have been carried out in the candidate algorithm. The algorithm makes use of fuzzy logic for weight calculation. The results are compared with DACE/LIF approach and it is observed that the proposed algorithm improves the result in terms of quality parameters like PSNR (Peak Signal to Noise Ratio), AMBE (Absolute Mean Brightness Error) and SSIM (Structural Similarity Index) by 0.5 dB, 3 and 0.1 respectively from the existing technique.
MODIFIED HISTOGRAM EQUALIZATION FOR IMAGE CONTRAST ENHANCEMENT USING PARTICLE...ijcseit
A novel Modified Histogram Equalization (MHE) technique for contrast enhancement is proposed in this
paper. This technique modifies the probability density function of an image by introducing constraints prior
to the process of histogram equalization (HE). These constraints are formulated using two parameters
which are optimized using swarm intelligence. This technique of contrast enhancement takes control over
the effect of HE so that it enhances the image without causing any loss to its details. A median adjustment
factor is then added to the result to normalize the change in the luminance level after enhancement. This
factor suppresses the effect of luminance change due to the presence of outlier pixels. The outlier pixels of
highly deviated intensities have greater impact in changing the contrast of an image. This approach
provides a convenient and effective way to control the enhancement process, while being adaptive to
various types of images. Experimental results show that the proposed technique gives better results in
terms of Discrete Entropy and SSIM values than the existing histogram-based equalization methods.
Contrast enhancement of color images using improved retinex methodeSAT Journals
Abstract Color images provide large information for human visual perception compared to grayscale images. Color image enhancement methods enhance the visual data to increase the clarity of the color image. It increases human perception of information. Different color image contrast enhancement methods are used to increase the contrast of the color images. The Retinex algorithms enhance the color images similar to the scene perceived by the human eye. Multiscale retinex with color restoration (MSRCR) is a type of retinex algorithm. The MSRCR algorithm results in graying out and halo artifacts at the edges of the images. So here the focus is on improving the MSRCR algorithm by combining it with contrast limited adaptive histogram equalization (CLAHE) using image. Keywords: color image enhancement,retinex algorithms
IJERA (International journal of Engineering Research and Applications) is International online, ... peer reviewed journal. For more detail or submit your article, please visit www.ijera.com
Image enhancement is one of the challenging issues in image processing. Its objective is to process an image so that the result is more suitable than the original for a specific application. Digital image enhancement techniques provide many choices for improving the visual quality of images, and the appropriate choice of technique is very important. Image enhancement plays a fundamental role in vision applications, much work has recently been done in this field, and many techniques have been proposed for enhancing digital images. This paper provides an overview, analysis and survey of the techniques commonly used for image enhancement.
Breast cancer is one of the most common diseases diagnosed among female cancer patients. Early detection of breast cancer is needed to reduce the risk of fatality, as no cure has yet been found for this illness. This research is conducted to improve on the Gradient Vector Flow (GVF) Snake Active Contour technique for mammography segmentation. The mammogram image is segmented to locate lesions using the Chan-Vese Active Contour and the Localized Active Contour; the effectiveness of the two methods is then compared and the better method chosen. The Digital Database for Screening Mammography (DDSM) is used for the purpose of screening. First, the images undergo pre-processing using a Gaussian low-pass filter to remove unwanted noise. After that, contrast enhancement is applied to the images. Segmentation of the mammograms is then conducted using the Chan-Vese Active Contour and Localized Active Contour methods. The results show that the Chan-Vese technique outperforms the Localized Active Contour with 90% accuracy.
An image enhancement method based on gabor filtering in wavelet domain and ad...nooriasukmaningtyas
Images are not always good enough to convey the proper information. An image may sometimes be very bright or very dark, or it may have low or high contrast. For these reasons, image enhancement plays an important role in digital image processing. In this paper we propose an image enhancement technique in which Gabor and median filtering are performed in the wavelet domain and adaptive histogram equalization is performed in the spatial domain. Brightness and contrast are the two parameters used for analyzing the performance of the proposed method.
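The two performance parameters named above have simple common definitions; the sketch below assumes mean intensity for brightness and RMS contrast (the standard deviation of intensities), which is one conventional choice rather than the paper's stated formulas:

```python
import math

def brightness(pixels):
    """Mean intensity of the image."""
    return sum(pixels) / len(pixels)

def rms_contrast(pixels):
    """Root-mean-square contrast: standard deviation of intensities."""
    mu = brightness(pixels)
    return math.sqrt(sum((p - mu) ** 2 for p in pixels) / len(pixels))

flat = [100] * 6                        # uniform patch: zero contrast
spread = [0, 50, 100, 150, 200, 250]    # full-range ramp: high contrast
print(brightness(spread), rms_contrast(flat), round(rms_contrast(spread), 2))
```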
A MODIFIED HISTOGRAM BASED FAST ENHANCEMENT ALGORITHMcsandit
Contrast enhancement of medical images plays an important role in disease diagnosis, especially in cancer cases. Histogram equalization is considered the most popular algorithm for contrast enhancement owing to its effectiveness and simplicity. In this paper, we present a modified version of the Histogram Based Fast Enhancement Algorithm. The original algorithm enhances the areas of interest with little complexity, but it is applied only to CT head images, and its idea is based on treating the soft tissues while ignoring other details in the image. The proposed modification makes the algorithm valid for most CT image types, with enhanced results.
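The idea of enhancing soft tissue while ignoring other details can be illustrated with a window/level mapping, where intensities inside a chosen window are stretched to the full display range and everything outside is clamped. The window values below are illustrative, not taken from the paper:

```python
def window_level(pixels, center, width, out_max=255):
    """Stretch [center - width/2, center + width/2] onto [0, out_max]."""
    lo, hi = center - width / 2, center + width / 2
    out = []
    for p in pixels:
        if p <= lo:
            out.append(0)               # below the window: clamp to black
        elif p >= hi:
            out.append(out_max)         # above the window: clamp to white
        else:
            out.append(round((p - lo) / width * out_max))
    return out

# e.g. a soft-tissue-style window centred at 40, 80 units wide
print(window_level([-1000, 0, 40, 80, 1000], center=40, width=80))
# -> [0, 0, 128, 255, 255]
```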
EFFECTIVE PROCESSING AND ANALYSIS OF RADIOTHERAPY IMAGESsipij
The a-Si Electronic Portal Imaging Device (EPID) is an important tool to verify the location of the radiation therapy beam with respect to the patient anatomy. But Electronic Portal Images (EPI) suffer from low contrast. In order to have better in-treatment images from which to extract relevant features of the anatomy, image processing tools need to be integrated into radiology systems. The goal of this research work is to inspect several image processing techniques for contrast enhancement of electronic portal images and to gauge parameters like mean, variance, standard deviation, MSE, RMSE, entropy, PSNR, AMBE, normalised cross correlation, average difference, structural content (SC), maximum difference and normalised absolute error (NAE) to study their visual quality improvement. In addition, by adding salt and pepper noise, Gaussian noise and motion blur, we calculate error measurement parameters like the Universal Image Quality (UIQ) index, Enhancement Measurement Error (EME), Pearson Correlation Coefficient, SNR and Mean Absolute Error (MAE). The improved results point out that image processing tools need to be incorporated into radiology for accurate delivery of dose.
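Several of the gauge parameters listed above have simple closed forms. The sketch below (illustrative, not the authors' code) computes MSE, RMSE, PSNR and AMBE for 8-bit images flattened to equal-length lists:

```python
import math

def mse(a, b):
    """Mean squared error between two equal-length images."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def rmse(a, b):
    return math.sqrt(mse(a, b))

def psnr(a, b, peak=255):
    """Peak signal-to-noise ratio in dB (infinite for identical images)."""
    m = mse(a, b)
    return float('inf') if m == 0 else 10 * math.log10(peak ** 2 / m)

def ambe(a, b):
    """Absolute Mean Brightness Error between original and enhanced."""
    return abs(sum(a) / len(a) - sum(b) / len(b))

orig = [10, 20, 30, 40]
enh = [12, 18, 33, 41]
print(mse(orig, enh), round(psnr(orig, enh), 2), ambe(orig, enh))
# -> 4.5 41.6 1.0
```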
MODIFIED HISTOGRAM EQUALIZATION FOR IMAGE CONTRAST ENHANCEMENT USING PARTICLE...ijcseit
COLOUR IMAGE ENHANCEMENT BASED ON HISTOGRAM EQUALIZATIONecij
Histogram equalization is a nonlinear technique for adjusting the contrast of an image using its histogram. It changes the mean brightness of a grayscale image away from that of the original image. There are various histogram equalization techniques, such as classical Histogram Equalization, Contrast Limited Adaptive Histogram Equalization, Brightness Preserving Bi-Histogram Equalization, Dualistic Sub-Image Histogram Equalization, Minimum Mean Brightness Error Bi-Histogram Equalization, Recursive Mean Separate Histogram Equalization and Recursive Sub-Image Histogram Equalization. In this paper, the histogram equalization approach for gray-level images is extended to colour images. The acquired image is converted into HSV (Hue, Saturation, Value). The image is then decomposed into two parts using an exposure threshold, and the two parts are equalized independently. Over-enhancement is also controlled in this method by using a clipping threshold. For measuring the performance of the enhanced image, entropy and contrast are calculated.
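The exposure-threshold split described above can be sketched as follows, in the spirit of exposure-based sub-image histogram equalization; the paper's exact formulas are not reproduced here, so treat this as an illustration:

```python
def exposure_threshold(hist, levels=256):
    """Exposure in [0,1] from the histogram; split point = L * (1 - exposure)."""
    total = sum(hist)
    exposure = sum(k * h for k, h in enumerate(hist)) / (levels * total)
    return int(levels * (1 - exposure))

def equalize_range(hist, lo, hi):
    """Equalization LUT mapping bins [lo, hi] back onto [lo, hi]."""
    total = sum(hist[lo:hi + 1]) or 1
    lut, cum = {}, 0
    for v in range(lo, hi + 1):
        cum += hist[v]
        lut[v] = lo + round(cum / total * (hi - lo))
    return lut

hist = [0] * 256
for v, c in [(30, 40), (60, 30), (200, 30)]:   # a mostly dark image
    hist[v] = c
t = exposure_threshold(hist)
print(t)   # -> 166: under-exposure pushes the split into the upper half
```

The two sub-histograms, bins [0, t] and [t+1, 255], are then equalized independently with `equalize_range`, which keeps dark and bright regions from being remapped across each other.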
Segmentation of Tumor Region in MRI Images of Brain using Mathematical Morpho...CSCJournals
This paper introduces an efficient method for detecting brain tumors in cerebral MRI images. The methodology consists of two steps: enhancement and segmentation. An enhancement process is applied to improve the quality of the images and to limit the risk of fusing distinct regions in the segmentation phase. We apply mathematical morphology both to increase the contrast in the MRI images and to segment them. Experimental results on brain images show the feasibility and performance of the proposed approach.
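As a concrete illustration of the morphological building blocks such pipelines rely on, here is a minimal pure-Python sketch of binary erosion and dilation with a 3x3 square structuring element (illustrative; the paper's actual operators and parameters are not specified):

```python
def _neighbors(img, r, c):
    """Yield the 3x3 neighbourhood of (r, c), treating outside pixels as 0."""
    rows, cols = len(img), len(img[0])
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            yield img[rr][cc] if 0 <= rr < rows and 0 <= cc < cols else 0

def dilate(img):
    """A pixel is set if ANY pixel in its 3x3 neighbourhood is set."""
    return [[int(any(_neighbors(img, r, c))) for c in range(len(img[0]))]
            for r in range(len(img))]

def erode(img):
    """A pixel is set only if ALL pixels in its 3x3 neighbourhood are set."""
    return [[int(all(_neighbors(img, r, c))) for c in range(len(img[0]))]
            for r in range(len(img))]

blob = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
print(erode(blob))      # the 2x2 blob has no fully interior pixel -> all 0
print(dilate(blob))     # grows by one pixel in every direction -> all 1
```

Opening (erode then dilate) and closing (dilate then erode) combine these primitives; morphological reconstruction, as used here, iterates dilation constrained by a mask image.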
Similar to Optimized Histogram Based Contrast Limited Enhancement for Mammogram Images (20)
Power System State Estimation - A ReviewIDES Editor
The aim of this article is to provide a comprehensive
survey on power system state estimation techniques. The
algorithms used for finding the system states under both static
and dynamic state estimations are discussed in brief. The
authors are of the opinion that pursuing research in the
area of state estimation with PMU and SCADA measurements
is state of the art and timely.
Artificial Intelligence Technique based Reactive Power Planning Incorporating...IDES Editor
Reactive Power Planning (RPP) is a major concern in the operation and control of power systems. This paper compares the effectiveness of Evolutionary Programming (EP) and the New Improved Differential Evolution (NIMDE) in solving the RPP problem incorporating FACTS controllers such as the Static VAR Compensator (SVC), Thyristor Controlled Series Capacitor (TCSC) and Unified Power Flow Controller (UPFC), considering voltage stability. With the help of the Fast Voltage Stability Index (FVSI), the critical lines and buses are identified for installing the FACTS controllers. The optimal settings of the control variables, namely the generator voltages, transformer tap settings, and the allocation and parameter settings of the SVC, TCSC and UPFC, are considered for reactive power planning. The proposed algorithm is tested and validated on the IEEE 30-bus system and a 72-bus Indian system. Simulation results show that the UPFC gives better results than the SVC and TCSC, and that the FACTS controllers reduce the system losses.
Design and Performance Analysis of Genetic based PID-PSS with SVC in a Multi-...IDES Editor
Damping of power system oscillations with the help
of proposed optimal Proportional Integral Derivative Power
System Stabilizer (PID-PSS) and Static Var Compensator
(SVC)-based controllers are thoroughly investigated in this
paper. This study presents robust tuning of PID-PSS and
SVC-based controllers using Genetic Algorithms (GA) in
multi machine power systems by considering detailed model
of the generators (model 1.1). The effectiveness of FACTS-based
controllers in general, and the SVC-based controller in
particular depends upon their proper location. Modal
controllability and observability are used to locate SVC–based
controller. The performance of the proposed controllers is
compared with conventional lead-lag power system stabilizer
(CPSS) and demonstrated on 10 machines, 39 bus New England
test system. Simulation studies show that the proposed genetic
based PID-PSS with SVC based controller provides better
performance.
Optimal Placement of DG for Loss Reduction and Voltage Sag Mitigation in Radi...IDES Editor
The need to operate the power system economically and with optimum voltage levels has led to an increasing interest in Distributed Generation. In order to reduce the power losses and improve the voltage in the distribution system, distributed generators (DGs) are connected to load buses. To reduce the total power losses in the system, the most important step is to identify the proper locations and sizes of the DGs. This paper presents a new methodology using a population-based metaheuristic, the Artificial Bee Colony (ABC) algorithm, for placing DGs in radial distribution systems so as to reduce the real power losses, improve the voltage profile and mitigate voltage sags. Power loss reduction is an important factor for utility companies because it is directly proportional to company benefits in a competitive electricity market, while meeting better power quality standards is also important for customer orientation. In this paper, an ABC algorithm is developed to attain these goals together. In order to evaluate the sag mitigation capability of the proposed algorithm, the voltage at voltage-sensitive buses is investigated. An existing 20 kV network has been chosen as the test network, and results for the proposed method are compared on the radial distribution system.
Line Losses in the 14-Bus Power System Network using UPFCIDES Editor
Controlling power flow in modern power systems
can be made more flexible by the use of recent developments
in power electronic and computing control technology. The
Unified Power Flow Controller (UPFC) is a Flexible AC
transmission system (FACTS) device that can control all the
three system variables namely line reactance, magnitude and
phase angle difference of voltage across the line. The UPFC
provides a promising means to control power flow in modern
power systems. Essentially the performance depends on proper
control setting achievable through a power flow analysis
program. This paper presents a reliable method to meet the
requirements by developing a Newton-Raphson based load
flow calculation through which control settings of UPFC can
be determined for the pre-specified power flow between the
lines. The proposed method keeps Newton-Raphson Load Flow
(NRLF) algorithm intact and needs (little modification in the
Jacobian matrix). A MATLAB program has been developed to
calculate the control settings of UPFC and the power flow
between the lines after the load flow is converged. Case studies
have been performed on IEEE 5-bus system and 14-bus system
to show that the proposed method is effective. These studies
indicate that the method maintains the basic NRLF properties
such as fast computational speed, high degree of accuracy and
good convergence rate.
Study of Structural Behaviour of Gravity Dam with Various Features of Gallery...IDES Editor
The size and shape of an opening in a dam cause stress concentration, as well as stress variation in the rest of the dam cross-section. The gravity method of analysis considers neither the size of the opening nor the elastic properties of the dam material. The objective of this study is therefore to apply the Finite Element Method, which accounts for the size of the opening, the elastic properties of the material, and the stress distribution caused by the geometric discontinuity in the dam cross-section. Stress concentration inside the dam increases with the opening, which can result in failure of the dam; hence it is necessary to analyse large openings inside the dam. The analysis is carried out by keeping the percentage area of the opening constant while varying the size and shape of the opening. For this purpose, a section of the Koyna Dam is considered. Based on its geometry and loading conditions, the dam is modelled as a plane strain element in FEM, so a 2D plane strain analysis is carried out. The results obtained are then compared to find the most efficient way of providing a large opening in a gravity dam.
Assessing Uncertainty of Pushover Analysis to Geometric ModelingIDES Editor
Pushover analysis is a popular tool for seismic performance evaluation of existing and new structures. It is a nonlinear static procedure in which monotonically increasing loads are applied to the structure until it can no longer resist further load. The strength of concrete and steel adopted in the analysis may not match the real structure as constructed, and pushover results are very sensitive to the material model, the geometric model, the location of plastic hinges and, in general, to the procedure followed by the analyst. In this paper, an attempt is made to assess the uncertainty in pushover analysis results by considering user-defined hinges, with the frame modelled both as a bare frame and as a frame with the slab modelled as a rigid diaphragm, and by comparing the results with experimental observations. The uncertain parameters considered include the strength of concrete, the strength of steel and the cover to the reinforcement, which are randomly generated and incorporated into the analysis. The results are then compared with experimental observations.
Secure Multi-Party Negotiation: An Analysis for Electronic Payments in Mobile...IDES Editor
This paper presents an auction-based framework for secure multi-party decision protocols. In addition to implementations that are very lightweight, the main focus is on synchronizing security features to avoid manipulation of agreements and to reduce user traffic. The paper shows that different auction protocols built on top of the framework can be run collaboratively using mobile devices. It presents the negotiation between the auctioneer and the offering parties, and this negotiation shows that multi-party security is far better than the existing system.
Selfish Node Isolation & Incentivation using Progressive ThresholdsIDES Editor
The problems associated with selfish nodes in
MANET are addressed by a collaborative watchdog approach
which reduces the detection time for selfish nodes, thereby
improving the performance and accuracy of watchdogs [1]. In
the related works they make use of credit based systems, reputation
based mechanisms, pathrater and watchdog mechanism
to detect such selfish nodes. In this paper we follow an approach
of collaborative watchdog which reduces the detection
time for selfish nodes and also involves the removal of such
selfish nodes based on some progressively assessed thresholds.
The thresholds give a node a chance to stop misbehaving
before it is permanently deleted from the network.
The node passes through several isolation processes before it
is permanently removed. Another version of AODV protocol
is used here which allows the simulation of selfish nodes in
NS2 by adding or modifying log files in the protocol.
Various OSI Layer Attacks and Countermeasure to Enhance the Performance of WS...IDES Editor
Wireless sensor networks are networks with a non-wired infrastructure and dynamic topology. In the OSI model, each layer is prone to various attacks that degrade the performance of a network. In this paper, several attacks on four layers of the OSI model are discussed, and a security mechanism is described to prevent a network-layer attack, namely the wormhole attack. In a wormhole attack, two or more malicious nodes create a covert channel that attracts traffic towards itself by advertising a low-latency link, then starts dropping and replaying packets in the multi-path route. This paper proposes a promiscuous-mode method to detect and isolate the malicious node during a wormhole attack, using the Ad hoc On-demand Distance Vector routing protocol (AODV) with an omnidirectional antenna. In the implemented methodology, nodes that are not participating in multi-path routing generate an alarm message upon delay, after which the malicious node is detected and isolated from the network. We also observe that not only the same kinds of attacks but also the same kinds of countermeasures can appear in multiple layers; for example, misbehavior detection techniques can be applied to almost all the layers discussed.
Responsive Parameter based an AntiWorm Approach to Prevent Wormhole Attack in...IDES Editor
Recent advancements in wireless technology and its widespread deployment have brought remarkable efficiency gains to the corporate, industrial and military sectors. The increasing popularity and usage of wireless technology creates a need for more secure wireless ad hoc networks. This paper researches and develops a new protocol that prevents wormhole attacks on an ad hoc network. A few existing protocols detect wormhole attacks, but they require highly specialized equipment not found on most wireless devices. This paper aims to develop a defense against wormhole attacks, an anti-worm protocol based on responsive parameters, that does not require a significant amount of specialized equipment, tight clock synchronization, or GPS dependencies.
Cloud Security and Data Integrity with Client Accountability FrameworkIDES Editor
Cloud-based services provide efficient and seamless ways of sharing data across the cloud. The fact that data owners no longer possess their data makes it very difficult to assure data confidentiality and to enable secure data sharing in the cloud. Despite all its advantages, this remains a major limitation and a barrier to the wider deployment of cloud-based services. One possible way of ensuring trust in this respect is to introduce an accountability feature into the cloud computing scenario; the cloud framework requires the promotion of distributed accountability for such a dynamic environment [1]. Some works suggest an accountability framework that ensures distributed accountability for data sharing by generating only a log of data accesses, without any embedded feedback mechanism for owner permission towards data protection [2]. The proposed system is an enhanced client accountability framework that adds client-side verification of each access for enhanced data security. The integrity of the data residing with the cloud service provider is also maintained through secure outsourcing. Besides, JAR (Java Archive) files are authenticated to ensure file protection and to maintain a safer environment for data sharing. The analysis of the various functionalities of the framework demonstrates both its accountability and its security features in an efficient manner.
Genetic Algorithm based Layered Detection and Defense of HTTP BotnetIDES Editor
An HTTP botnet uses the HTTP protocol to create a chain of bots, thereby compromising other systems. By using the HTTP protocol and port 80, attacks can not only be hidden but can also pass through the firewall without being detected. DPR-based detection leads to better analysis of botnet attacks [3]; however, it provides only probabilistic detection of the attacker and is also time-consuming and error-prone. This paper proposes a genetic-algorithm-based layered approach for detecting as well as preventing botnet attacks. The paper reviews a p2p firewall implementation which forms the basis of the filtering. Performance evaluation is done based on precision, F-value and probability. The layered approach reduces the computation and the overall time requirement [7], and the genetic algorithm promises a low false-positive rate.
Enhancing Data Storage Security in Cloud Computing Through SteganographyIDES Editor
In cloud computing, data storage is a significant issue because all the data reside over a set of interconnected resource pools that enable the data to be accessed through virtual machines. Cloud computing moves application software and databases to large data centers, where the management of the data is actually done. As the resource pools are situated in various corners of the world, the management of the data and services may not be fully trustworthy, so various issues need to be addressed with respect to the management, service, privacy and security of data. The privacy and security of data are especially challenging. To ensure the privacy and security of data-at-rest in cloud computing, we propose an effective and novel approach to data security in the cloud: hiding data within images, following the concept of steganography. The main objective of this paper is to prevent unauthorized users from accessing data in cloud data storage centers. The scheme stores data at cloud data storage centers and retrieves it when needed.
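The hiding step can be illustrated with the simplest steganographic embedding, writing the message bits into pixel least-significant bits. This is a generic LSB sketch, not the scheme the paper actually proposes:

```python
def embed(pixels, message):
    """Write each bit of `message` into the LSB of successive pixel bytes."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    assert len(bits) <= len(pixels), "cover image too small"
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit    # overwrite the least-significant bit
    return out

def extract(pixels, n_bytes):
    """Read n_bytes back out of the pixel LSBs."""
    data = bytearray()
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (pixels[b * 8 + i] & 1)
        data.append(byte)
    return bytes(data)

cover = list(range(100, 180))           # stand-in for 80 pixel bytes
stego = embed(cover, b"hi")
print(extract(stego, 2))                # -> b'hi'
```

Each pixel changes by at most one intensity level, which is what makes LSB embedding visually imperceptible; practical schemes typically also encrypt the payload and spread the bits under a key.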
The main tasks of a Wireless Sensor Network
(WSN) are data collection from its nodes and communication
of this data to the base station (BS). The protocols used for
communication among the WSN nodes and between the WSN
and the BS, must consider the resource constraints of nodes,
battery energy, computational capabilities and memory. The
WSN applications involve unattended operation of the network
over an extended period of time. In order to extend the lifetime
of a WSN, efficient routing protocols need to be adopted. The
proposed low power routing protocol based on tree-based
network structure reliably forwards the measured data towards
the BS using TDMA. An energy consumption analysis of the
WSN making use of this protocol is also carried out. It is
found that the network is energy efficient, with an average
duty cycle of 0.7% for the WSN nodes. The OMNeT++
simulation platform, along with the MiXiM framework, is
used.
Permutation of Pixels within the Shares of Visual Cryptography using KBRP for...IDES Editor
The security of authentication of internet based
co-banking services should not be susceptible to high risks.
The passwords are highly vulnerable to virus attacks due to
the lack of high end embedding of security methods. In order
for the passwords to be more secure, people are generally
compelled to select jumbled up character based passwords
which are not only less memorable but are also equally prone
to insecurity. Multiple use of distributed shares has been
studied to solve the problem of authentication by algorithms
based on thresholding of pixels in image processing and visual
cryptography concepts where the subset of shares is considered
for the recovery of the original image for authentication using
correlation functions [1][2]. The main disadvantages of the
above studies are the plain storage of the shares, and that one
of the shares is supplied to the customer, which leads to the
possibility of misuse by a third party. This paper proposes a
technique for scrambling of pixels by key based random
permutation (KBRP) within the shares before the
authentication has been attempted. Total number of shares to
be created is dependent on the multiplicity of ownership of
the account. By this method, the customers' uncertainty
regarding the security, storage and retrieval of their half of
the shares is minimized.
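The scrambling idea can be sketched with a key-seeded permutation of pixel indices. This uses Python's seeded Fisher-Yates shuffle as a stand-in for KBRP, whose actual construction is not reproduced here:

```python
import random

def key_permutation(n, key):
    """Deterministic permutation of range(n) derived from a secret key."""
    idx = list(range(n))
    random.Random(key).shuffle(idx)     # same key -> same permutation
    return idx

def scramble(pixels, key):
    perm = key_permutation(len(pixels), key)
    return [pixels[i] for i in perm]

def unscramble(pixels, key):
    """Invert scramble() given the same key."""
    perm = key_permutation(len(pixels), key)
    out = [None] * len(pixels)
    for pos, src in enumerate(perm):
        out[src] = pixels[pos]
    return out

share = list(range(16))                 # stand-in for one share's pixels
scrambled = scramble(share, key="account-42")
assert unscramble(scrambled, key="account-42") == share
```

Because the permutation is derived from the key alone, the scrambled share can be stored as-is; only a holder of the key can restore the pixel order before the visual-cryptography recovery step.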
This paper presents a trifocal Rotman Lens Design
approach. The effects of focal ratio and element spacing on
the performance of Rotman Lens are described. A three beam
prototype feeding 4 element antenna array working in L-band
has been simulated using RLD v1.7 software. Simulated
results show that the simulated lens has a return loss of
–12.4 dB at 1.8 GHz. Beam-to-array-port phase error variation
with change in the focal ratio and element spacing has also
been investigated.
Band Clustering for the Lossless Compression of AVIRIS Hyperspectral ImagesIDES Editor
Hyperspectral images can be efficiently compressed
through a linear predictive model, as for example the one
used in the SLSQ algorithm. In this paper we exploit this
predictive model on the AVIRIS images by identifying,
through an off-line approach, a common subset of bands which
are not spectrally related to any other bands. These bands
are not useful as prediction reference for the SLSQ 3-D
predictive model and we need to encode them via other
prediction strategies which consider only spatial correlation.
We have obtained this subset by clustering the AVIRIS bands
via the clustering by compression approach. The main result
of this paper is the list of the bands, not related with the
others, for AVIRIS images. The clustering trees obtained for
AVIRIS and the relationship among bands they depict is also
an interesting starting point for future research.
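"Clustering by compression" rests on the Normalized Compression Distance: two objects are close if compressing them together costs little more than compressing the larger one alone. A sketch with zlib as the compressor (the paper's actual compressor and settings are assumptions here):

```python
import zlib

def clen(data: bytes) -> int:
    """Compressed length, the practical stand-in for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: ~0 for similar, ~1 for unrelated."""
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"abcabcabc" * 50                   # highly redundant "band"
b = b"abcabcabc" * 50                   # identical content
c = bytes(range(256)) * 2               # unrelated content
print(round(ncd(a, b), 3), round(ncd(a, c), 3))  # near 0, near 1
```

Running NCD pairwise over the bands and feeding the distance matrix to a hierarchical clustering routine yields the kind of clustering trees the paper describes.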
Microelectronic Circuit Analogous to Hydrogen Bonding Network in Active Site ...IDES Editor
A microelectronic circuit of block-elements
functionally analogous to two hydrogen bonding networks is
investigated. The hydrogen bonding networks are extracted
from the β-lactamase protein and are formed in its active site.
Each hydrogen bond of the network is described in equivalent
electrical circuit by three or four-terminal block-element.
Each block-element is coded in Matlab. Static and dynamic
analyses are performed. The resultant microelectronic circuit
analogous to the hydrogen bonding network operates as
current mirror, sine pulse source, triangular pulse source as
well as signal modulator.
Texture Unit based Monocular Real-world Scene Classification using SOM and KN...IDES Editor
In this paper a method is proposed to discriminate
real-world scenes into natural and man-made scenes of similar
depth. The global roughness of a scene image varies as a function
of image depth. An increase in image depth leads to an increase
in roughness in man-made scenes; natural scenes, on the contrary,
exhibit smoother behavior at greater image depth.
arrangement of pixels in scene structure can be well explained
by local texture information in a pixel and its neighborhood.
Our proposed method analyses the local texture information of a
scene image using a texture unit matrix. For the final classification
we use both supervised and unsupervised learning, via the
K-Nearest Neighbor (KNN) classifier and the Self-Organizing
Map (SOM) respectively. The technique's very low computational
complexity makes it suitable for online classification.
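The abstract does not define the texture unit matrix; in the classical He and Wang formulation, each 3×3 neighborhood is coded ternarily against its center pixel and the eight digits combine into a texture unit number in [0, 6560]. A sketch under that assumption (the neighbor ordering is one common convention):

```python
import numpy as np

def texture_unit_number(patch: np.ndarray) -> int:
    """Texture unit number for a 3x3 grayscale patch (He & Wang style).

    Each of the 8 neighbors is coded 0/1/2 against the center pixel
    (below / equal / above) and the ternary digits combine into a
    single value in [0, 6560].
    """
    assert patch.shape == (3, 3)
    center = patch[1, 1]
    # Neighbors in a fixed clockwise order starting at the top-left.
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    n = 0
    for i, (r, col) in enumerate(order):
        digit = 0 if patch[r, col] < center else (1 if patch[r, col] == center else 2)
        n += digit * 3 ** i
    return n

patch = np.array([[5, 5, 9],
                  [1, 5, 5],
                  [5, 5, 5]])
tu = texture_unit_number(patch)
```

The histogram of these values over all pixels (the texture spectrum) is what a KNN or SOM classifier would consume.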
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor... (Neo4j)
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Communications Mining Series - Zero to Hero - Session 1 (DianaGray10)
This session provides an introduction to UiPath Communication Mining, its importance, and an overview of the platform. You will acquire a good understanding of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
Elevating Tactical DDD Patterns Through Object Calisthenics (Dorra BARTAGUIZ)
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
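As a concrete taste of how an Object Calisthenics constraint ("wrap all primitives") reinforces a tactical DDD pattern (the value object), here is a small illustrative sketch; the Money type and its rules are invented for this example, not taken from the talk:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Money:
    """A value object: primitives are wrapped, invariants live with the type."""
    amount_cents: int
    currency: str

    def __post_init__(self) -> None:
        if self.amount_cents < 0:
            raise ValueError("amount must be non-negative")
        if len(self.currency) != 3:
            raise ValueError("currency must be a 3-letter ISO 4217 code")

    def add(self, other: "Money") -> "Money":
        if other.currency != self.currency:
            raise ValueError("cannot add different currencies")
        return Money(self.amount_cents + other.amount_cents, self.currency)

total = Money(500, "EUR").add(Money(250, "EUR"))  # a new immutable value
```

Because the primitive is wrapped, the "cannot add different currencies" rule is enforced by the domain model itself rather than scattered across callers.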
UiPath Test Automation using UiPath Test Suite series, part 5 (DianaGray10)
Welcome to the UiPath Test Automation using UiPath Test Suite series, part 5. In this session, we will cover CI/CD with DevOps.
Topics covered:
CI/CD within UiPath
End-to-end overview of a CI/CD pipeline with Azure DevOps
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
The Art of the Pitch: WordPress Relationships and Sales (Laura Byrne)
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers, without pulling teeth or pulling your hair out. Practical tips and strategies for successful relationship building that leads to closing the deal.
GridMate - End to end testing is a critical piece to ensure quality and avoid... (ThomasParaiso2)
End-to-end testing is a critical piece for ensuring quality and avoiding regressions. In this session, we share our journey building an E2E testing pipeline for GridMate components (LWC and Aura) using Cypress, JSForce, FakerJS…
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
Securing your Kubernetes cluster: a step-by-step guide to success! (KatiaHIMEUR1)
Today, after several years of existence, with an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been easier to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Removing Uninteresting Bytes in Software Fuzzing (Aftab Hussain)
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speed up fuzzing campaigns by pinpointing and eliminating uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined tools from two critical Linux packages -- Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security-analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher overall coverage. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are the slides of the talk given at the IEEE International Conference on Software Testing, Verification and Validation Workshops, ICSTW 2022.
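DIAR's actual analysis is not described in the abstract; the following toy sketch only illustrates the general idea of dropping seed bytes whose removal does not change a target's observable behaviour. The parser_behaviour stand-in and the greedy loop are invented for illustration:

```python
def parser_behaviour(seed: bytes) -> tuple:
    """Stand-in target: 'behaviour' is which of two header checks pass."""
    return (seed[:2] == b"EL", b"\x01" in seed[2:])

def minimize_seed(seed: bytes) -> bytes:
    """Greedily drop any byte whose removal leaves the behaviour unchanged."""
    baseline = parser_behaviour(seed)
    i = 0
    while i < len(seed):
        candidate = seed[:i] + seed[i + 1:]
        if parser_behaviour(candidate) == baseline:
            seed = candidate   # uninteresting byte: drop it
        else:
            i += 1             # the byte matters: keep it
    return seed

seed = b"EL" + b"\x00" * 6 + b"\x01" + b"\x00" * 7
lean = minimize_seed(seed)     # the padding bytes are stripped away
```

A real fuzzing setup would use coverage feedback rather than a hand-written behaviour function, but the payoff is the same: mutations are spent only on bytes that matter.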
Smart TV Buyer Insights Survey 2024 (91mobiles)
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, the aspects they look at in a new TV, and their TV-buying preferences.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... (James Anderson)
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to release software to market, combined with traditionally slow and manual security checks, has caused gaps in continuous security, an important piece of the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface of their application supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerabilities and security breaches. This needs to be achieved with existing toolchains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of existing vulnerabilities, preventing the introduction of security issues into the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work, along with a knack for helping others understand how things work. He has around 20 years of solution-engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and on application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also maintains a strong connection with customers, leading design and architecture for strategic implementations. He is a frequent speaker and a well-known leader in continuous delivery and in integrating security into software delivery.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe (Paige Cruz)
Monitoring and observability aren’t traditionally found in software curriculums, and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is part of our current company’s observability stack.
While the dev and ops silo continues to crumble, many organizations still treat monitoring and observability as the purview of ops, infra, and SRE teams. This is a mistake: achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on: