This paper scrutinizes image compression using halftoning-based block truncation coding (HBTC) for color images. Several algorithms are considered: the original Block Truncation Coding (BTC), Ordered Dither Block Truncation Coding, Error Diffusion Block Truncation Coding, and Dot Diffused Block Truncation Coding. All of these techniques divide the image into non-overlapping blocks. BTC serves as the basic compression technique, but it exhibits two disadvantages: false contours and the blocking effect. Hence halftoning-based block truncation coding (HBTC) is used to overcome these two issues. Objective measures such as Peak Signal-to-Noise Ratio, Mean Square Error, Structural Similarity Index, and Compression Ratio are used to evaluate image quality. The conclusions show that the Dot Diffused Block Truncation Coding algorithm outperforms both Block Truncation Coding and Error Diffusion Block Truncation Coding.
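For reference, the objective measures named above have standard definitions; a minimal pure-Python sketch for 8-bit data follows (SSIM omitted for brevity; the function names are ours, not the paper's):

```python
import math

def mse(original, reconstructed):
    """Mean Square Error between two equal-length 8-bit pixel sequences."""
    n = len(original)
    return sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / n

def psnr(original, reconstructed, peak=255):
    """Peak Signal-to-Noise Ratio in dB; infinite for identical images."""
    error = mse(original, reconstructed)
    if error == 0:
        return float("inf")
    return 10 * math.log10(peak ** 2 / error)

orig = [52, 55, 61, 66, 70, 61, 64, 73]
recon = [54, 55, 60, 66, 72, 61, 62, 73]
print(mse(orig, recon))    # 1.625
print(psnr(orig, recon))   # ~46.02 dB: small error, high PSNR
```

Lower MSE and higher PSNR indicate a reconstruction closer to the original; compression ratio is simply original size divided by compressed size.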
A COMPARATIVE STUDY ON IMAGE COMPRESSION USING HALFTONING BASED BLOCK TRUNCAT... — IJCI JOURNAL
This document summarizes a research paper that compares different techniques for image compression using halftoning-based block truncation coding (HBTC) for color images. It describes the basic block truncation coding (BTC) method and its limitations. It then discusses several HBTC methods - ordered dither block truncation coding, error diffusion block truncation coding, and dot diffused block truncation coding - and evaluates their performance based on metrics like PSNR, MSE, compression ratio, and SSIM. The results show that dot diffused block truncation coding provides better visual quality than the other methods.
Halftoning-based BTC image reconstruction using patch processing with border ... — TELKOMNIKA JOURNAL
This paper presents a new halftoning-based block truncation coding (HBTC) image reconstruction method using a sparse representation framework. HBTC is a simple yet powerful image compression technique which can effectively remove the typical blocking effect and false contour. Two types of HBTC methods are discussed in this paper, i.e., ordered dither block truncation coding (ODBTC) and error diffusion block truncation coding (EDBTC). The proposed sparsity-based method suppresses the impulsive noise on ODBTC and EDBTC decoded images with a coupled dictionary containing the HBTC image component and clean image component dictionaries. Herein, a sparse coefficient is estimated from the HBTC decoded image by means of the HBTC image dictionary. The reconstructed image is subsequently built and aligned from the clean (i.e., non-compressed) image dictionary and the predicted sparse coefficient. To further reduce the blocking effect, each image patch is first identified as “border” or “non-border” before applying the sparse representation framework. Adding Laplacian prior knowledge of the HBTC decoded image yields better reconstructed image quality. The experimental results demonstrate the effectiveness of the proposed HBTC image reconstruction, and the proposed method outperforms the former schemes in terms of reconstructed image quality.
The document discusses a method for compressing color images using block truncation coding (BTC) and genetic algorithms. BTC works by dividing images into blocks and quantizing each block's pixels to a high or low value based on the block's mean, which introduces quality issues. Color images have correlated red, green, and blue planes. The method uses a common bit plane, optimized with genetic algorithms, to represent all three color planes, improving quality and compression ratio compared to standard BTC and error-diffused BTC. Experimental results showed the proposed method provided higher-quality reconstructed images as measured by peak signal-to-noise ratio.
This document discusses parallel processing and compound image compression techniques. It examines the computational complexity and quantitative optimization of various image compression algorithms like BTC, DCT, DWT, DTCWT, SPIHT and EZW. The performance is evaluated in terms of coding efficiency, memory usage, image quality and quantity. Block Truncation Coding and Discrete Cosine Transform compression methods are described in more detail.
Modified Skip Line Encoding for Binary Image Compression — idescitation
This paper proposes a modified skip line encoding technique for lossless compression of binary images. Skip line encoding exploits correlation between successive scan lines by encoding only one line and skipping similar lines. The proposed technique improves upon existing skip line encoding by allowing a scan line to be skipped if a similar line exists anywhere in the image, rather than just successive lines. Experimental results on sample images show the modified technique achieves higher compression ratios than conventional skip line encoding.
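The "skip a line if a similar one exists anywhere" idea can be illustrated with a rough sketch. Assumptions of ours, not the paper's: rows are sequences of 0/1 pixels, exact equality stands in for "similar", and repeats are stored as back-references to the first occurrence:

```python
def encode_modified_skip_line(rows):
    """Encode a binary image (list of 0/1 rows): store each distinct row
    once and replace any repeat with a back-reference to the entry where
    that row first appeared, wherever in the image that occurrence is."""
    seen = {}      # row tuple -> index of its "line" entry in the output
    encoded = []   # list of ("line", row_tuple) or ("skip", index)
    for row in rows:
        key = tuple(row)
        if key in seen:
            encoded.append(("skip", seen[key]))
        else:
            seen[key] = len(encoded)
            encoded.append(("line", key))
    return encoded

def decode(encoded):
    """Rebuild the image by following back-references."""
    out = []
    for kind, payload in encoded:
        out.append(list(encoded[payload][1]) if kind == "skip" else list(payload))
    return out
```

Conventional skip line encoding would only compare each row against its immediate predecessor, so alternating rows (as in the example below) would not compress; referencing any earlier row is what the modification adds.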
IRJET- Color Image Compression using Canonic Signed Digit and Block based... — IRJET Journal
This document proposes a color image compression technique using canonical signed digit (CSD) and multi-level block truncation coding (BTC). It begins by introducing the need for image compression and discusses existing techniques like discrete wavelet transform (DWT) and BTC. The proposed technique is then described, which uses CSD to implement DWT convolution and multi-level BTC to compress color image channels. Simulation results on different images show improved peak signal-to-noise ratio and computation time compared to other techniques. In conclusion, the CSD and multi-level BTC approach achieves good compression performance for color images.
This document discusses Block Truncation Coding (BTC), a simple lossy image compression technique. BTC works by dividing an image into non-overlapping blocks and quantizing each pixel in a block to a single bit based on the mean and standard deviation of the pixel values in that block. The algorithm is described along with its advantages of low complexity and edge preservation, and its disadvantages of relatively high bit rates and blocky artifacts. Variants like Absolute Moment BTC and Adaptive BTC that improve upon the basic technique are also covered. In conclusion, while newer standards have emerged, BTC remains useful for applications requiring low complexity and moderate compression rates.
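The single-bit quantization and two-level reconstruction of classic BTC can be sketched as follows; this is a generic illustration on one flattened block, with function names of our own choosing:

```python
import math

def btc_block(block):
    """Classic BTC for one block: keep the mean and standard deviation,
    plus a 1-bit-per-pixel bitmap (pixel >= mean)."""
    n = len(block)
    mean = sum(block) / n
    std = math.sqrt(sum((p - mean) ** 2 for p in block) / n)
    bitmap = [1 if p >= mean else 0 for p in block]
    return mean, std, bitmap

def btc_reconstruct(mean, std, bitmap):
    """Rebuild the block with two output levels a < b chosen so the
    reconstruction preserves the original block mean and variance."""
    n = len(bitmap)
    q = sum(bitmap)                  # pixels quantized to the high level
    if q == 0 or q == n:             # flat block: a single level suffices
        return [round(mean)] * n
    a = mean - std * math.sqrt(q / (n - q))        # low output level
    b = mean + std * math.sqrt((n - q) / q)        # high output level
    return [round(b) if bit else round(a) for bit in bitmap]
```

Only `mean`, `std`, and the bitmap are stored per block, which is where the compression comes from; the hard two-level decision is also what produces the false contours and blocking artifacts discussed above.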
Wavelet based Image Coding Schemes: A Recent Survey — ijsc
A variety of new and powerful algorithms have been developed for image compression over the years. Among them the wavelet-based image compression schemes have gained much popularity due to their overlapping nature which reduces the blocking artifacts that are common phenomena in JPEG compression and multiresolution character which leads to superior energy compaction with high quality reconstructed images. This paper provides a detailed survey on some of the popular wavelet coding techniques such as the Embedded Zerotree Wavelet (EZW) coding, Set Partitioning in Hierarchical Tree (SPIHT) coding, the Set Partitioned Embedded Block (SPECK) Coder, and the Embedded Block Coding with Optimized Truncation (EBCOT) algorithm. Other wavelet-based coding techniques like the Wavelet Difference Reduction (WDR) and the Adaptive Scanned Wavelet Difference Reduction (ASWDR) algorithms, the Space Frequency Quantization (SFQ) algorithm, the Embedded Predictive Wavelet Image Coder (EPWIC), Compression with Reversible Embedded Wavelet (CREW), the Stack-Run (SR) coding and the recent Geometric Wavelet (GW) coding are also discussed. Based on the review, recommendations and discussions are presented for algorithm development and implementation.
Review On Fractal Image Compression Techniques — IRJET Journal
This document reviews different techniques for fractal image compression. It discusses how fractal image compression works by dividing an image into range and domain blocks. It then summarizes several papers that propose fractal image compression techniques using the discrete cosine transform (DCT) or discrete wavelet transform (DWT). These techniques aim to improve the compression ratio and reduce encoding time. Finally, the document proposes a new method that combines wavelets with fractal image compression to further increase the compression ratio while keeping image quality losses during decompression low.
International Journal on Soft Computing (IJSC) — ijsc
This document provides a summary of various wavelet-based image coding schemes. It discusses the basics of image compression including transformation, quantization, and encoding. It then reviews the wavelet transform approach and several popular wavelet coding techniques such as EZW, SPIHT, SPECK, EBCOT, WDR, ASWDR, SFQ, EPWIC, CREW, SR and GW coding. These techniques exploit the multi-resolution properties of wavelets for superior energy compaction and compression performance compared to DCT-based methods like JPEG. The document provides details on how each coding scheme works and compares their features.
This document summarizes an article that proposes modifications to the JPEG 2000 image compression standard to achieve higher compression ratios while maintaining acceptable error rates. The proposed Adaptive JPEG 2000 technique involves pre-processing images with a transfer function to make them more suitable for compression by JPEG 2000. This is intended to provide higher compression ratios than the original JPEG 2000 standard while keeping root mean square error within allowed thresholds. The document provides background on JPEG 2000 and lossy image compression techniques, describes the proposed pre-processing approach, and indicates it was tested on single-channel images.
This document summarizes a research paper that analyzed medical image compression using the discrete cosine transform (DCT) with entropy encoding and Huffman coding on MRI brain images. The paper implemented DCT to compress images at varying quality levels and block sizes. It also used Huffman encoding to assign shorter bit codes to more frequent symbols. The research tested the algorithms on a set of brain MRI images. Results showed that the compression ratio increased with higher quality levels, while peak signal-to-noise ratio and mean squared error varied based on the technique and block size used. The paper concluded that DCT with entropy encoding can effectively compress MRI images with less quality loss.
Digital Image Compression using Hybrid Scheme using DWT and Quantization wit... — IRJET Journal
This document discusses a hybrid image compression technique using both the discrete cosine transform (DCT) and discrete wavelet transform (DWT). It begins with an introduction to image compression and its goals of reducing file size while maintaining quality. Next, it outlines the proposed hybrid compression method, which applies DWT to blocks of the image, then DCT to the approximation coefficients from the DWT. This is intended to achieve higher compression ratios than DCT or DWT alone, with fewer blocking artifacts and false contours. Simulation results on various test images show the hybrid method provides higher PSNR and lower MSE than the individual transforms, demonstrating it outperforms them in terms of both quality and compression. The document concludes the hybrid approach is more suitable for image compression than either transform alone.
A Review on Image Compression using DCT and DWT — IJSRD
This document reviews image compression techniques using discrete cosine transform (DCT) and discrete wavelet transform (DWT). It discusses how DCT transforms images from spatial to frequency domains, allowing for energy compaction and efficient encoding. DWT is a multi-resolution technique that represents images at different frequency bands. The document analyzes various studies that have used DCT and DWT for compression and compares their performance in terms of metrics like peak signal-to-noise ratio and compression ratio. It finds that DWT generally provides better compression performance than DCT, though DCT requires less computational resources. A hybrid DCT-DWT technique is also proposed to combine the advantages of both methods.
The International Journal of Engineering & Science is aimed at providing a platform for researchers, engineers, scientists, or educators to publish their original research results, to exchange new ideas, to disseminate information in innovative designs, engineering experiences and technological skills. It is also the Journal's objective to promote engineering and technology education. All papers submitted to the Journal will be blind peer-reviewed. Only original articles will be published.
Feature Extraction in Content based Image Retrieval — ijcnes
This document discusses feature extraction methods for content-based image retrieval (CBIR). It describes Order Dither Block Truncation Coding (ODBTC), an image compression technique that can be used to extract image features without decoding. The proposed CBIR system extracts two features from ODBTC-encoded images: Color Co-occurrence Feature (CCF) and Bit Pattern Feature (BPF). CCF is extracted from color quantizers produced by ODBTC, while BPF is based on a bit pattern codebook generated from ODBTC bitmap images. Experimental results show this approach provides superior retrieval accuracy compared to earlier CBIR methods.
Comparison of different Fingerprint Compression Techniques — sipij
The important features of the wavelet transform and different methods for compressing fingerprint images have been implemented. Image quality is measured objectively using peak signal-to-noise ratio (PSNR) and mean square error (MSE). A comparative study of the discrete cosine transform based Joint Photographic Experts Group (JPEG) standard, wavelet-based basic Set Partitioning in Hierarchical Trees (SPIHT), and Modified SPIHT is done. The comparison shows that Modified SPIHT offers better compression than basic SPIHT and JPEG. The results will help application developers to choose a good wavelet compression system for their applications.
BIG DATA-DRIVEN FAST REDUCING THE VISUAL BLOCK ARTIFACTS OF DCT COMPRESSED IM... — IJDKP
1) The document proposes a new simple method to reduce visual block artifacts in images compressed using DCT (used in JPEG) for urban surveillance systems.
2) The method smooths only the connection edges between adjacent blocks while keeping other image areas unchanged.
3) Simulation results show the proposed method achieves better image quality as measured by PSNR compared to median and Wiener filters, while using significantly fewer computational resources.
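As an illustration of smoothing only the connection edges between adjacent blocks, here is a minimal sketch on a single scanline; the block size, boundary weights, and function name are our assumptions, not the paper's actual filter:

```python
def smooth_block_edges(scanline, block_size=8):
    """Smooth only the two pixels that sit on either side of each block
    boundary, leaving the interior of every block unchanged (the
    boundary-only idea in a deliberately simplified 1-D form)."""
    out = list(scanline)
    for k in range(block_size, len(scanline), block_size):
        left, right = scanline[k - 1], scanline[k]
        # pull each boundary pixel toward the across-boundary average
        out[k - 1] = (3 * left + right) // 4
        out[k] = (left + 3 * right) // 4
    return out

print(smooth_block_edges([10] * 8 + [30] * 8))
# only positions 7 and 8 change: ... 10, 15, 25, 30 ...
```

Because only boundary pixels are touched, the cost is proportional to the number of block edges rather than the number of pixels, which is consistent with the low resource usage claimed above.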
Performance Analysis of Compression Techniques Using SVD, BTC, DCT and GP — IOSR Journals
This document summarizes and compares four different image compression techniques: Singular Value Decomposition (SVD), Block Truncation Coding (BTC), Discrete Cosine Transform (DCT), and Gaussian Pyramid (GP). It analyzes the performance of each technique based on metrics like Peak Signal to Noise Ratio (PSNR), Mean Squared Error (MSE), and compression ratio. The techniques are tested on biometric images from iris, fingerprint, and palm print databases to evaluate image quality after compression.
This document summarizes and compares four different image compression techniques: Singular Value Decomposition (SVD), Block Truncation Coding (BTC), Discrete Cosine Transform (DCT), and Gaussian Pyramid (GP). It analyzes the performance of each technique based on metrics like Peak Signal to Noise Ratio (PSNR), Mean Squared Error (MSE), and compression ratio. Experiments were conducted on biometric image databases to compress and reconstruct images using these four techniques, and the results were evaluated and compared based on the above-mentioned metrics.
Improving Performance of Multileveled BTC Based CBIR Using Sundry Color Spaces — CSCJournals
The paper presents an extension of content-based image retrieval (CBIR) techniques based on multilevel Block Truncation Coding (BTC) using nine sundry color spaces. Block truncation coding based feature extraction is one of the CBIR methods that uses the color features of an image. The approach basically considers the red, green, and blue planes of an image to compute the feature vector. This BTC-based CBIR can be extended to multileveled BTC for performance improvement in image retrieval. The paper extends multileveled BTC from the RGB color space to nine other color spaces. The CBIR techniques BTC Level-1, BTC Level-2, BTC Level-3, and BTC Level-4 are applied using various color spaces to analyze and compare their performances. The CBIR techniques are tested on a generic image database of 1000 images spread across 11 categories. For each CBIR technique, 55 queries (5 per category) are fired on the extended Wang generic image database to compute average precision and recall for all queries. The results have shown performance improvement (i.e., higher precision and recall values) with BTC-CBIR methods using luminance-chrominance color spaces (YCgCb, Kekre's LUV, YUV, YIQ, YCbCr) as compared to non-luminance (RGB, HSI, HSV, rgb, XYZ) color spaces. The performance of multileveled BTC-CBIR increases gradually with level up to a certain extent (Level 3) and then improves only slightly due to voids being created at higher levels. In all levels of BTC, Kekre's LUV color space gives the best performance.
An Algorithm for Improving the Quality of Compacted JPEG Image by Minimizes t... — ijcga
The block-transform-coded JPEG, a lossy image compression format, has been used to keep the storage and bandwidth requirements of digital images at practical levels. However, JPEG compression schemes may exhibit unwanted image artifacts, such as the 'blocky' artifact found in smooth/monotone areas of an image, caused by the coarse quantization of DCT coefficients. A number of image filtering approaches incorporating value-averaging filters have been analyzed in the literature in order to smooth out the discontinuities that appear across DCT block boundaries. Although some of these approaches are able to decrease the severity of these unwanted artifacts to some extent, others have limitations that cause excessive blurring of high-contrast edges in the image. The image deblocking algorithm presented in this paper aims to filter the blocked boundaries. This is accomplished by smoothing, detecting blocked edges, and then filtering the difference between the pixels containing the blocked edge. The deblocking algorithm presented has been successful in reducing blocky artifacts in an image and therefore increases both the subjective and objective quality of the reconstructed image.
This document proposes a multi-level block truncation code algorithm for RGB image compression to achieve low bit rates and high quality. The algorithm combines bit mapping and quantization by dividing images into blocks, calculating thresholds, quantizing thresholds, and representing blocks with bit maps. It was tested on standard images like flowers, Lena, and baboon. Results showed improved peak signal-to-noise ratio and mean squared error compared to existing methods, demonstrating the effectiveness of the proposed multi-level block truncation code algorithm for image compression.
Review of Diverse Techniques Used for Effective Fractal Image Compression — IRJET Journal
This document reviews different techniques for fractal image compression to enhance compression ratio while maintaining image quality. It discusses algorithms like quadtree partitioning with Huffman coding (QPHC), discrete cosine transform based fractal image compression (DCT-FIC), discrete wavelet transform based fractal image compression (DWTFIC), and Grover's quantum search algorithm based fractal image compression (QAFIC). The document also analyzes works applying these techniques and concludes that combining QAFIC with the tiny block size processing algorithm may further improve compression ratio with minimal quality loss.
Color image compression based on spatial and magnitude signal decomposition — IJECEIAES
In this paper, a simple color image compression system is proposed using image signal decomposition. The RGB color bands are converted to the less correlated YUV color model, and the pixel value (magnitude) in each band is decomposed into two values: most significant and least significant. Because the most significant value (MSV) is strongly affected by even a simple modification, an adaptive lossless image compression system is proposed for it using bit plane (BP) slicing, delta pulse code modulation (Delta PCM), and adaptive quadtree (QT) partitioning, followed by an adaptive shift encoder. On the other hand, a lossy compression system is introduced to handle the least significant value (LSV); it is based on an adaptive, error-bounded coding system and uses the DCT compression scheme. The performance of the developed compression system was analyzed and compared with that attained from the universal JPEG standard, and the results of applying the proposed system indicate that its performance is comparable to or better than that of the JPEG standard.
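The most/least significant decomposition of a pixel value can be sketched as a simple bit split; the 4/4 split point below is our assumption for illustration, not necessarily the paper's choice:

```python
def decompose(pixel, split=4):
    """Split an 8-bit value into a most significant value (upper bits,
    to be coded losslessly) and a least significant value (lower bits,
    to be coded lossily)."""
    msv = pixel >> split
    lsv = pixel & ((1 << split) - 1)
    return msv, lsv

def recompose(msv, lsv, split=4):
    """Inverse of decompose: reassemble the original pixel value."""
    return (msv << split) | lsv

print(decompose(182))        # 0b10110110 -> (0b1011, 0b0110) = (11, 6)
print(recompose(11, 6))      # 182
```

A one-unit error in the MSV shifts the reconstructed pixel by 16 gray levels while the same error in the LSV shifts it by only 1, which is why the MSV stream warrants lossless coding and the LSV stream tolerates lossy coding.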
A Novel Image Compression Approach Inexact Computing — ijtsrd
This work proposes a novel approach for digital image processing that relies on inexact (faulty) computation to address some of the issues with discrete cosine transform (DCT) compression. The proposed system has three processing stages. The first employs an approximated DCT for picture compression to eliminate all compute-demanding floating-point multiplications and to execute DCT processing with integer additions and, in certain cases, logical right/left shifts. The second stage reduces the amount of data that must be processed from the first stage by removing frequencies that cannot be perceived by human senses. Finally, in order to reduce power consumption and delay, the third stage employs inexact circuit-level adders for the DCT computation. A collection of structured pictures is compressed with the suggested three-level method for measurement. Various figures of merit, such as energy consumption, delay, peak signal-to-noise ratio, average difference, and absolute maximum difference, are compared with current compression techniques; an error analysis is also carried out to substantiate the simulation findings. The results indicate significant gains in energy and time reduction while retaining acceptable accuracy levels for image processing applications. Sonam Kumari | Manish Rai, "A Novel Image Compression Approach: Inexact Computing", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-6, Issue-6, October 2022. URL: https://www.ijtsrd.com/papers/ijtsrd52197.pdf Paper URL: https://www.ijtsrd.com/engineering/electrical-engineering/52197/a-novel-image-compression-approachinexact-computing/sonam-kumari
Jpeg image compression using discrete cosine transform a survey — IJCSES Journal
Due to the increasing requirements for transmission of images in computer and mobile environments, research in the field of image compression has increased significantly. Image compression plays a crucial role in digital image processing; it is also very important for efficient transmission and storage of images. When we compute the number of bits per image resulting from typical sampling rates and quantization methods, we find that image compression is needed. Therefore the development of efficient techniques for image compression has become necessary. This paper is a survey of lossy image compression using the Discrete Cosine Transform; it covers the JPEG compression algorithm, which is used for full-colour still image applications, and describes all of its components.
Post-Segmentation Approach for Lossless Region of Interest Coding — sipij
This paper presents a lossless region of interest coding technique that is suitable for interactive telemedicine over networks. The new encoding scheme allows a server to transmit only part of the compressed image data progressively as a client requests it. This technique differs from region-scalable coding in JPEG2000 since it does not define the region of interest (ROI) when encoding occurs. In the proposed method, the image is fully encoded and stored on the server, and a user may select a ROI after the compression is done; this feature is the main contribution of the research. The proposed coding method achieves region-scalable coding by using integer wavelet lifting, successive quantization, and partitioning that rearranges the wavelet coefficients into subsets. Each subset, which represents a local area in the image, is then separately coded using run-length and entropy coding. In this paper, we show the benefits of using the proposed technique with examples and simulation results.
International Journal of Computational Science and Information Technology (IJ... — ijcsity
International Journal of Computational Science and Information Technology (IJCSITY) focuses on complex systems, information, and computation using mathematics and engineering techniques. This open-access, peer-reviewed journal will act as a major forum for the presentation of innovative ideas, approaches, developments, and research projects in the area of computation theory and applications. It will also serve to facilitate the exchange of information between researchers and industry professionals to discuss the latest issues and advancements in the area of advanced computation and its applications.
NETWORK MEDIA ATTENTION AND GREEN TECHNOLOGY INNOVATION — ijcsity
This paper will provide a novel empirical study for the relationship between network media attention and green technology innovation and examine how network media attention can ease financing constraints. It collected data from listed companies in China's heavy pollution industry and performed rigorous regression analysis, in order to innovatively explore the environmental governance functions of the media. It found that network media attention significantly promotes green technology innovation. By analyzing the inner mechanism further, it found that network media attention can promote green innovation by easing financing constraints. Besides, network media attention has a significant positive impact on green invention patents while not affecting green utility model patents.
Similar to A REVIEW ON IMAGE COMPRESSION USING HALFTONING BASED BTC
Review On Fractal Image Compression TechniquesIRJET Journal
This document reviews different techniques for fractal image compression. It discusses how fractal image compression works by dividing an image into range and domain blocks. It then summarizes several papers that propose techniques for fractal image compression using the discrete cosine transform (DCT) or discrete wavelet transform (DWT). These techniques aim to improve the compression ratio and reduce encoding time. Finally, the document proposes a new method that combines wavelets with fractal image compression to further increase the compression ratio while keeping image-quality loss during decompression low.
International Journal on Soft Computing ( IJSC )ijsc
This document provides a summary of various wavelet-based image coding schemes. It discusses the basics of image compression including transformation, quantization, and encoding. It then reviews the wavelet transform approach and several popular wavelet coding techniques such as EZW, SPIHT, SPECK, EBCOT, WDR, ASWDR, SFQ, EPWIC, CREW, SR and GW coding. These techniques exploit the multi-resolution properties of wavelets for superior energy compaction and compression performance compared to DCT-based methods like JPEG. The document provides details on how each coding scheme works and compares their features.
This document summarizes an article that proposes modifications to the JPEG 2000 image compression standard to achieve higher compression ratios while maintaining acceptable error rates. The proposed Adaptive JPEG 2000 technique involves pre-processing images with a transfer function to make them more suitable for compression by JPEG 2000. This is intended to provide higher compression ratios than the original JPEG 2000 standard while keeping root mean square error within allowed thresholds. The document provides background on JPEG 2000 and lossy image compression techniques, describes the proposed pre-processing approach, and indicates it was tested on single-channel images.
This document summarizes a research paper that analyzed medical image compression using the discrete cosine transform (DCT) with entropy encoding and Huffman coding on MRI brain images. The paper implemented DCT to compress images at varying quality levels and block sizes. It also used Huffman encoding to assign shorter bit codes to more frequent symbols. The research tested the algorithms on a set of brain MRI images. Results showed that the compression ratio increased with higher quality levels, while peak signal-to-noise ratio and mean squared error varied based on the technique and block size used. The paper concluded that DCT with entropy encoding can effectively compress MRI images with less quality loss.
Digital Image Compression using Hybrid Scheme using DWT and Quantization wit...IRJET Journal
This document discusses a hybrid image compression technique using both the discrete cosine transform (DCT) and discrete wavelet transform (DWT). It begins with an introduction to image compression and its goals of reducing file size while maintaining quality. Next, it outlines the proposed hybrid compression method, which applies DWT to blocks of the image, then DCT to the approximation coefficients from DWT. This is intended to achieve higher compression ratios than DCT or DWT alone, with fewer blocking artifacts and false contours. Simulation results on various test images show the hybrid method provides higher PSNR and lower MSE than the individual transforms, demonstrating it outperforms them in terms of both quality and compression. The document concludes that the hybrid approach is better suited for image compression than either transform alone.
A Review on Image Compression using DCT and DWTIJSRD
This document reviews image compression techniques using discrete cosine transform (DCT) and discrete wavelet transform (DWT). It discusses how DCT transforms images from spatial to frequency domains, allowing for energy compaction and efficient encoding. DWT is a multi-resolution technique that represents images at different frequency bands. The document analyzes various studies that have used DCT and DWT for compression and compares their performance in terms of metrics like peak signal-to-noise ratio and compression ratio. It finds that DWT generally provides better compression performance than DCT, though DCT requires less computational resources. A hybrid DCT-DWT technique is also proposed to combine the advantages of both methods.
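The DCT's energy compaction mentioned above can be illustrated with a minimal, naive 2-D DCT-II of an N x N block, the per-8x8-block transform that JPEG builds on. This is an O(N^4) sketch for illustration only, not a fast implementation, and the function name is mine:

```python
import math

def dct2_block(block):
    """Naive orthonormal 2-D DCT-II of an N x N block (O(N^4)).
    JPEG applies this transform to each 8x8 block of an image."""
    n = len(block)

    def alpha(k):
        # Normalization factors that make the transform orthonormal.
        return math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)

    coeffs = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = sum(block[x][y]
                    * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                    * math.cos((2 * y + 1) * v * math.pi / (2 * n))
                    for x in range(n) for y in range(n))
            coeffs[u][v] = alpha(u) * alpha(v) * s
    return coeffs
```

For a flat block all energy lands in the DC coefficient `coeffs[0][0]`, which is why smooth regions compress so well under DCT-based schemes.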
The International Journal of Engineering & Science is aimed at providing a platform for researchers, engineers, scientists, or educators to publish their original research results, to exchange new ideas, to disseminate information in innovative designs, engineering experiences and technological skills. It is also the Journal's objective to promote engineering and technology education. All papers submitted to the Journal will be blind peer-reviewed. Only original articles will be published.
Feature Extraction in Content based Image Retrievalijcnes
This document discusses feature extraction methods for content-based image retrieval (CBIR). It describes Ordered Dither Block Truncation Coding (ODBTC), an image compression technique that can be used to extract image features without decoding. The proposed CBIR system extracts two features from ODBTC-encoded images: the Color Co-occurrence Feature (CCF) and the Bit Pattern Feature (BPF). CCF is extracted from the color quantizers produced by ODBTC, while BPF is based on a bit pattern codebook generated from ODBTC bitmap images. Experimental results show this approach provides superior retrieval accuracy compared to earlier CBIR methods.
Comparison of different Fingerprint Compression Techniquessipij
The important features of the wavelet transform and different methods for compressing fingerprint images have been implemented. Image quality is measured objectively using peak signal-to-noise ratio (PSNR) and mean square error (MSE). A comparative study using the discrete cosine transform based Joint Photographic Experts Group (JPEG) standard, wavelet-based basic Set Partitioning in Hierarchical Trees (SPIHT), and Modified SPIHT is done. The comparison shows that Modified SPIHT offers better compression than basic SPIHT and JPEG. The results will help application developers choose a good wavelet compression system for their applications.
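The two objective metrics used throughout these comparisons follow directly from their definitions; a minimal sketch over flattened pixel sequences (function names are my own):

```python
import math

def mse(original, compressed):
    """Mean squared error between two equal-length pixel sequences."""
    return sum((a - b) ** 2 for a, b in zip(original, compressed)) / len(original)

def psnr(original, compressed, max_val=255.0):
    """Peak signal-to-noise ratio in dB; infinite for identical inputs."""
    error = mse(original, compressed)
    if error == 0:
        return float("inf")
    return 10.0 * math.log10(max_val ** 2 / error)
```

Higher PSNR (equivalently, lower MSE) indicates a reconstruction closer to the original, which is how the compression schemes in these papers are ranked.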
BIG DATA-DRIVEN FAST REDUCING THE VISUAL BLOCK ARTIFACTS OF DCT COMPRESSED IM...IJDKP
1) The document proposes a new simple method to reduce visual block artifacts in images compressed using DCT (used in JPEG) for urban surveillance systems.
2) The method smooths only the connection edges between adjacent blocks while keeping other image areas unchanged.
3) Simulation results show the proposed method achieves better image quality as measured by PSNR compared to median and Wiener filters, while using significantly fewer computational resources.
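The idea of smoothing only the pixels that straddle block boundaries while leaving everything else unchanged can be sketched in one dimension. The 8-pixel block size and simple two-pixel averaging are illustrative assumptions, not the paper's exact filter:

```python
def smooth_block_edges(row, block=8):
    """Average the two pixels straddling each block boundary; all
    other pixels are left untouched (simplified 1-D illustration)."""
    out = list(row)
    for b in range(block, len(row), block):
        avg = (row[b - 1] + row[b]) / 2.0
        out[b - 1] = avg
        out[b] = avg
    return out
```

Because only boundary pixels are touched, the cost is proportional to the number of block edges rather than the number of pixels, which is the source of the computational savings claimed above.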
Performance Analysis of Compression Techniques Using SVD, BTC, DCT and GPIOSR Journals
This document summarizes and compares four different image compression techniques: Singular Value Decomposition (SVD), Block Truncation Coding (BTC), Discrete Cosine Transform (DCT), and Gaussian Pyramid (GP). It analyzes the performance of each technique based on metrics like Peak Signal to Noise Ratio (PSNR), Mean Squared Error (MSE), and compression ratio. Experiments were conducted on biometric images from iris, fingerprint, and palm print databases to compress and reconstruct images, and the results were evaluated and compared using the above metrics.
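Of the four techniques, SVD compression is the most compact to sketch: keep only the k largest singular triplets, i.e. a rank-k approximation. This assumes NumPy is available, and the function name is my own:

```python
import numpy as np

def svd_compress(image, k):
    """Rank-k approximation of a grayscale image matrix.
    Storage drops from m*n values to k*(m + n + 1) coefficients."""
    u, s, vt = np.linalg.svd(image.astype(np.float64), full_matrices=False)
    return u[:, :k] @ np.diag(s[:k]) @ vt[:k, :]
```

The quality/size trade-off is controlled by k: small k gives high compression with larger MSE, which is exactly the trade-off the paper measures via PSNR and compression ratio.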
Improving Performance of Multileveled BTC Based CBIR Using Sundry Color SpacesCSCJournals
The paper presents an extension of content-based image retrieval (CBIR) techniques based on multilevel Block Truncation Coding (BTC) using nine sundry color spaces. BTC-based feature extraction is one of the CBIR methods proposed using the color features of an image. The approach basically considers the red, green, and blue planes of an image to compute the feature vector. This BTC-based CBIR can be extended to multileveled BTC for performance improvement in image retrieval. The paper extends multileveled BTC from the RGB color space to nine other color spaces. The CBIR techniques BTC Level-1, BTC Level-2, BTC Level-3, and BTC Level-4 are applied using the various color spaces to analyze and compare their performances. The CBIR techniques are tested on a generic image database of 1000 images spread across 11 categories. For each CBIR technique, 55 queries (5 per category) are fired on the extended Wang generic image database to compute average precision and recall for all queries. The results show performance improvement (i.e., higher precision and recall values) with BTC-CBIR methods using luminance-chrominance color spaces (YCgCb, Kekre's LUV, YUV, YIQ, YCbCr) as compared to non-luminance (RGB, HSI, HSV, rgb, XYZ) color spaces. The performance of multileveled BTC-CBIR increases gradually with level up to a certain extent (Level 3) and then increases only slightly due to voids being created at higher levels. At all levels of BTC, Kekre's LUV color space gives the best performance.
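The luminance-chrominance spaces that performed best separate brightness from color. As one concrete example of such a space (not the paper's formulation for every space it tests), here is a minimal sketch of the full-range BT.601 RGB-to-YCbCr conversion:

```python
def rgb_to_ycbcr(r, g, b):
    """Full-range ITU-R BT.601 RGB -> YCbCr conversion.
    Y carries luminance; Cb and Cr carry chrominance offsets around 128."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr
```

Computing BTC features on the Y plane separately from Cb/Cr is what lets luminance-chrominance spaces outperform RGB in this kind of retrieval setup.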
An Algorithm for Improving the Quality of Compacted JPEG Image by Minimizes t...ijcga
JPEG, a block-transform-coded lossy image compression format, has been used to keep the storage and
bandwidth requirements of digital images at practical levels. However, JPEG compression schemes may
cause unwanted image artifacts to appear, such as the ‘blocky’ artifact found in smooth/monotone areas
of an image, caused by the coarse quantization of DCT coefficients. A number of image filtering
approaches have been analyzed in literature incorporating value-averaging filters in order to smooth out
the discontinuities that appear across DCT block boundaries. Although some of these approaches are able
to decrease the severity of these unwanted artifacts to some extent, other approaches have certain
limitations that cause excessive blurring to high-contrast edges in the image. The image deblocking
algorithm presented in this paper aims to filter the blocked boundaries. This is accomplished by employing
smoothening, detection of blocked edges and then filtering the difference between the pixels containing the
blocked edge. The deblocking algorithm presented has been successful in reducing blocky artifacts in an
image and therefore increases the subjective as well as objective quality of the reconstructed image.
This document proposes a multi-level block truncation code algorithm for RGB image compression to achieve low bit rates and high quality. The algorithm combines bit mapping and quantization by dividing images into blocks, calculating thresholds, quantizing thresholds, and representing blocks with bit maps. It was tested on standard images like flowers, Lena, and baboon. Results showed improved peak signal-to-noise ratio and mean squared error compared to existing methods, demonstrating the effectiveness of the proposed multi-level block truncation code algorithm for image compression.
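The multi-level scheme above extends the classic two-level BTC, which can be sketched as follows: each block is represented by its mean, standard deviation, and a bitmap, and decoded to two moment-preserving levels. This is a minimal sketch of standard BTC over a flattened block, not the paper's multi-level variant:

```python
import math

def btc_encode(block):
    """Classic two-level BTC: keep the block mean and standard deviation,
    plus a bitmap marking pixels at or above the mean."""
    n = len(block)
    mean = sum(block) / n
    var = sum((p - mean) ** 2 for p in block) / n
    bitmap = [1 if p >= mean else 0 for p in block]
    return mean, math.sqrt(var), bitmap

def btc_decode(mean, std, bitmap):
    """Reconstruct each pixel as one of two quantizer levels a (below mean)
    and b (at/above mean), chosen to preserve the block mean and variance."""
    n = len(bitmap)
    q = sum(bitmap)              # number of pixels at or above the mean
    if q in (0, n):              # flat block: a single level suffices
        return [round(mean)] * n
    a = mean - std * math.sqrt(q / (n - q))
    b = mean + std * math.sqrt((n - q) / q)
    return [round(b) if bit else round(a) for bit in bitmap]
```

The multi-level extension replaces the single threshold (the mean) with several thresholds per block, trading a few extra bits for finer quantization.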
Review of Diverse Techniques Used for Effective Fractal Image CompressionIRJET Journal
This document reviews different techniques for fractal image compression to enhance compression ratio while maintaining image quality. It discusses algorithms like quadtree partitioning with Huffman coding (QPHC), discrete cosine transform based fractal image compression (DCT-FIC), discrete wavelet transform based fractal image compression (DWTFIC), and Grover's quantum search algorithm based fractal image compression (QAFIC). The document also analyzes works applying these techniques and concludes that combining QAFIC with the tiny block size processing algorithm may further improve compression ratio with minimal quality loss.
Color image compression based on spatial and magnitude signal decomposition IJECEIAES
In this paper, a simple color image compression system is proposed using image signal decomposition. The RGB color bands are converted to the less correlated YUV color model, and the pixel value (magnitude) in each band is decomposed into two values: most and least significant. Given the importance of the most significant value (MSV), which is affected by even simple modifications, an adaptive lossless image compression system is proposed using bit plane (BP) slicing, delta pulse code modulation (Delta PCM), and adaptive quadtree (QT) partitioning, followed by an adaptive shift encoder. On the other hand, a lossy compression system is introduced to handle the least significant value (LSV); it is based on an adaptive, error-bounded coding system and uses the DCT compression scheme. The performance of the developed compression system was analyzed and compared with that attained from the universal JPEG standard, and the results of applying the proposed system indicate its performance is comparable to or better than that of the JPEG standard.
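The magnitude decomposition into a most significant value (MSV) and a least significant value (LSV) amounts to a bit-plane split; a minimal sketch, where the 4/4 bit split is an illustrative choice rather than necessarily the paper's:

```python
def split_magnitude(pixel, low_bits=4):
    """Decompose an 8-bit pixel into an MSV (upper bits, coded
    losslessly) and an LSV (lower bits, coded lossily)."""
    msv = pixel >> low_bits
    lsv = pixel & ((1 << low_bits) - 1)
    return msv, lsv

def join_magnitude(msv, lsv, low_bits=4):
    """Recombine the two parts into the original pixel value."""
    return (msv << low_bits) | lsv
```

Coding the MSV losslessly bounds the worst-case reconstruction error to the range of the LSV, which is what makes the lossy LSV stage "error bounded".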
A Novel Image Compression Approach Inexact Computingijtsrd
This work proposes a novel approach for digital image processing that relies on faulty computation to address some of the issues with discrete cosine transform (DCT) compression. The proposed system has three processing stages: the first employs an approximated DCT for picture compression to eliminate all compute-demanding floating-point multiplications and to execute DCT processing with integer additions and, in certain cases, logical right/left shifts. The second level reduces the amount of data that must be processed from the first level by removing frequencies that cannot be perceived by human senses. Finally, in order to reduce power consumption and delay, the third stage employs erroneous circuit-level adders for DCT computation. A collection of structured pictures is compressed for measurement using the suggested three-level method. Various figures of merit such as energy consumption, delay, peak signal-to-noise ratio, average difference, and absolute maximum difference are compared with current compression techniques, and an error analysis is also carried out to substantiate the simulation findings. The results indicate significant gains in energy and time reduction while retaining acceptable accuracy levels for image processing applications. Sonam Kumari | Manish Rai "A Novel Image Compression Approach-Inexact Computing" Published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-6 | Issue-6 , October 2022, URL: https://www.ijtsrd.com/papers/ijtsrd52197.pdf Paper URL: https://www.ijtsrd.com/engineering/electrical-engineering/52197/a-novel-image-compression-approachinexact-computing/sonam-kumari
Jpeg image compression using discrete cosine transform a surveyIJCSES Journal
Due to the increasing requirements for transmission of images in computer and mobile environments,
research in the field of image compression has increased significantly. Image compression plays a crucial
role in digital image processing; it is also very important for efficient transmission and storage of images.
When we compute the number of bits per image resulting from typical sampling rates and quantization
methods, we find that image compression is needed. Therefore, the development of efficient techniques for
image compression has become necessary. This paper is a survey of lossy image compression using the
Discrete Cosine Transform; it covers the JPEG compression algorithm, which is used for full-colour still-image
applications, and describes all of its components.
Post-Segmentation Approach for Lossless Region of Interest Codingsipij
This paper presents a lossless region of interest coding technique that is suitable for interactive telemedicine over networks. The new encoding scheme allows a server to transmit only a part of a compressed image data progressively as a client requests it. This technique is different from region scalable coding in JPEG2000 since it does not define region of interest (ROI) when encoding occurs. In the proposed method, the image is fully encoded and stored in the server. It also allows a user to select a ROI after the compression is done. This feature is the main contribution of research. The proposed coding method achieves the region scalable coding by using the integer wavelet lifting, successive quantization, and partitioning that rearranges the wavelet coefficients into subsets. Each subset that represents a local area in an image is then separately coded using run-length and entropy coding. In this paper, we will show the benefits of using the proposed technique with examples and simulation results.
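Each rearranged coefficient subset above is coded separately with run-length and entropy coding. The run-length stage can be sketched as follows (the function name and the pair-based output format are my own illustrative choices):

```python
def run_length_encode(symbols):
    """Code a sequence as (symbol, run length) pairs, merging
    consecutive repeats into a single pair."""
    runs = []
    for s in symbols:
        if runs and runs[-1][0] == s:
            runs[-1] = (s, runs[-1][1] + 1)
        else:
            runs.append((s, 1))
    return runs
```

After quantization, wavelet subsets contain long runs of zeros, so this stage alone shrinks them considerably before the entropy coder is applied.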
4th International Conference on Artificial Intelligence and Machine Learning ...ijcsity
4th International Conference on Artificial Intelligence and Machine Learning (CAIML 2023) will provide an excellent international forum for sharing knowledge and results in theory, methodology and applications of Artificial Intelligence and Machine Learning. The Conference looks for significant contributions to all major fields of the Artificial Intelligence, Machine Learning in theoretical and practical aspects. The aim of the Conference is to provide a platform to the researchers and practitioners from both academia as well as industry to meet and share cutting-edge development in the field.
5th International Conference on Machine Learning & Applications (CMLA 2023)ijcsity
5th International Conference on Machine Learning & Applications (CMLA 2023) will provide an excellent international forum for sharing knowledge and results in theory, methodology and applications of on Machine Learning & Applications.
International Journal of Computational Science and Information Technology (IJ...ijcsity
The document is a call for papers for the International Journal of Computational Science and Information Technology. The journal focuses on complex systems, information, and computation using mathematics and engineering techniques. Authors are invited to submit unpublished papers on topics related to algorithms, data structures, computational physics, machine learning, and more. The submission deadline is May 20, 2023, with notification of acceptance by June 20, 2023.
8th International Conference on Networks, Communications, Wireless and Mobile...ijcsity
8th International Conference on Networks, Communications, Wireless and Mobile Computing (NCWMC 2023) looks for significant contributions to the Computer Networks, Communications, wireless and mobile computing for wired and wireless networks in theoretical and practical aspects. Original papers are invited on computer Networks, network protocols and wireless networks, Data communication Technologies, network security and mobile computing. The goal of this Conference is to bring together researchers and practitioners from academia and industry to focus on advanced networking concepts and establishing new collaborations in these areas.
10th International Conference on Computer Science and Information Technology ...ijcsity
10th International Conference on Computer Science and Information Technology (CoSIT 2023) will provide an excellent international forum for sharing knowledge and results in theory, methodology and applications of Computer Science, Engineering and Information Technology. The Conference looks for significant contributions to all major fields of the Computer Science and Information Technology in theoretical and practical aspects. The aim of the conference is to provide a platform to the researchers and practitioners from both academia as well as industry to meet and share cutting-edge development in the field.
International Journal of Computational Science and Information Technology (IJ...ijcsity
The document is a call for papers for the International Journal of Computational Science and Information Technology (IJCSITY). IJCSITY focuses on complex systems, information, and computation using mathematics and engineering techniques. Authors are invited to submit unpublished papers on topics related to algorithms, data structures, computational physics/biology, cryptography, machine learning, and more. The submission deadline is April 22, 2023, with notification by May 22, 2023 and final manuscripts due by May 25, 2023.
International Conference on Speech and NLP (SPNLP 2023) ijcsity
International Conference on Speech and NLP (SPNLP 2023) will provide an excellent international forum for sharing knowledge and results in theory, methodology and applications of speech and Natural Language Processing (NLP).
International Journal of Computational Science and Information Technology (IJ...ijcsity
The document is a call for papers for the International Journal of Computational Science and Information Technology. The journal focuses on complex systems, information, and computation using mathematics and engineering techniques. Authors are invited to submit unpublished papers on topics related to algorithms, data structures, computational physics/biology, cryptography, machine learning, and more. The submission deadline is January 14, 2023, with notification by February 14 and final manuscripts due by February 22.
Introduction: e-waste, definition, sources of e-waste, hazardous substances in e-waste, effects of e-waste on environment and human health, need for e-waste management, e-waste handling rules, waste minimization techniques for managing e-waste, recycling of e-waste, disposal and treatment methods of e-waste, mechanism of extraction of precious metals from leaching solution, global scenario of e-waste, e-waste in India, case studies.
Using recycled concrete aggregates (RCA) for pavements is crucial to achieving sustainability. Implementing RCA for new pavement can minimize carbon footprint, conserve natural resources, reduce harmful emissions, and lower life cycle costs. Compared to natural aggregate (NA), RCA pavement has fewer comprehensive studies and sustainability assessments.
Harnessing WebAssembly for Real-time Stateless Streaming PipelinesChristina Lin
Traditionally, dealing with real-time data pipelines has involved significant overhead, even for straightforward tasks like data transformation or masking. However, in this talk, we’ll venture into the dynamic realm of WebAssembly (WASM) and discover how it can revolutionize the creation of stateless streaming pipelines within a Kafka (Redpanda) broker. These pipelines are adept at managing low-latency, high-data-volume scenarios.
Understanding Inductive Bias in Machine LearningSUTEJAS
This presentation explores the concept of inductive bias in machine learning. It explains how algorithms come with built-in assumptions and preferences that guide the learning process. You'll learn about the different types of inductive bias and how they can impact the performance and generalizability of machine learning models.
The presentation also covers the positive and negative aspects of inductive bias, along with strategies for mitigating potential drawbacks. We'll explore examples of how bias manifests in algorithms like neural networks and decision trees.
By understanding inductive bias, you can gain valuable insights into how machine learning models work and make informed decisions when building and deploying them.
ACEP Magazine edition 4th launched on 05.06.2024Rahul
This document provides information about the third edition of the magazine "Sthapatya" published by the Association of Civil Engineers (Practicing) Aurangabad. It includes messages from current and past presidents of ACEP, memories and photos from past ACEP events, information on life time achievement awards given by ACEP, and a technical article on concrete maintenance, repairs and strengthening. The document highlights activities of ACEP and provides a technical educational article for members.
KuberTENes Birthday Bash Guadalajara - K8sGPT first impressionsVictor Morales
K8sGPT is a tool that analyzes and diagnoses Kubernetes clusters. This presentation was used to share the requirements and dependencies to deploy K8sGPT in a local environment.
We have compiled the most important slides from each speaker's presentation. This year’s compilation, available for free, captures the key insights and contributions shared during the DfMAy 2024 conference.
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...IJECEIAES
Climate change's impact on the planet forced the United Nations and governments to promote green energies and electric transportation. The deployments of photovoltaic (PV) and electric vehicle (EV) systems gained stronger momentum due to their numerous advantages over fossil fuel types. The advantages go beyond sustainability to reach financial support and stability. The work in this paper introduces the hybrid system between PV and EV to support industrial and commercial plants. This paper covers the theoretical framework of the proposed hybrid system including the required equation to complete the cost analysis when PV and EV are present. In addition, the proposed design diagram which sets the priorities and requirements of the system is presented. The proposed approach allows setup to advance their power stability, especially during power outages. The presented information supports researchers and plant owners to complete the necessary analysis while promoting the deployment of clean energy. The result of a case study that represents a dairy milk farmer supports the theoretical works and highlights its advanced benefits to existing plants. The short return on investment of the proposed approach supports the paper's novelty approach for the sustainable electrical system. In addition, the proposed system allows for an isolated power setup without the need for a transmission line which enhances the safety of the electrical network
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...
A REVIEW ON IMAGE COMPRESSION USING HALFTONING BASED BTC
International Journal of Computational Science and Information Technology (IJCSITY) Vol.4, No.2, May 2016
DOI: 10.5121/ijcsity.2016.4201
A REVIEW ON IMAGE COMPRESSION USING
HALFTONING BASED BTC
Meharban M.S.¹ and Priya S.²

¹M.Tech Student, Dept. of Computer Science, Model Engineering College
²Associate Professor, Dept. of Computer Science, Model Engineering College
ABSTRACT
This paper scrutinizes image compression using halftoning based Block Truncation Coding for color images. Several algorithms are considered, namely the original Block Truncation Coding, Ordered Dither Block Truncation Coding, Error Diffusion Block Truncation Coding, and Dot Diffused Block Truncation Coding. All of these techniques divide the image into non-overlapping blocks. BTC serves as the basic compression technique, but it exhibits two disadvantages: false contours and the blocking effect. Hence, halftoning based block truncation coding (HBTC) is used to overcome these two issues. Objective measures such as Peak Signal to Noise Ratio, Mean Square Error, Structural Similarity Index, and Compression Ratio are used to evaluate image quality. The conclusions show that the Dot Diffused Block Truncation Coding algorithm outperforms both Block Truncation Coding and Error Diffusion Block Truncation Coding.
KEYWORDS
Halftoning, Image Compression, Block Truncation Coding (BTC), Error Diffusion, Dot Diffusion
1. INTRODUCTION
In recent years, multimedia products have grown rapidly, straining the available network bandwidth and storage. The theory of image compression therefore gains importance for reducing the storage space and transmission bandwidth needed. Digital halftoning is a technique for converting continuous-tone images into two-tone images [8]. The results can resemble the original images when viewed from a distance, owing to the low-pass nature of the Human Visual System (HVS). Today, digital halftoning plays a key role in almost every discipline that involves printing and displaying: all newspapers, magazines, and books are printed with digital halftoning [15]. For a color image, a separate halftone is generated for cyan, magenta, yellow, and black; each halftone screen is rotated to form a pattern called a rosette. Effective digital halftoning can substantially improve the quality of rendered images at minimal cost [8]. The major issues in choosing a halftoning technique are image quality and the amount of computation. The major halftoning methods developed so far include ordered dithering, error diffusion, and dot diffusion.

Block Truncation Coding (BTC) is a simple, lossy image compression technique which uses a moment-preserving quantization method for compressing digital grayscale as well as color images. It has been used for many years for compressing digital monochrome images.
Fig. 1. Enlarged details of halftone dot pattern for (a) grayscale image, (b) color image
The BTC method preserves the block mean and standard deviation [1]. The simplest way to extend BTC to a color image is to apply BTC to each color plane independently [10][2]. The disadvantage of this method is that three bit planes are needed; hence the achievable compression ratios are low.
2. BLOCK TRUNCATION CODING
Block Truncation Coding (BTC) is a lossy image compression technique which uses a moment-preserving quantization method for compressing digital grayscale as well as color images [2]. In BTC, the original image is divided into fixed-size non-overlapping blocks of size M × N [1]. The block size chosen is usually small, to limit edge blurring and the blocking effect. Each block is independently coded using a two-level (1-bit) quantizer whose two output values preserve the first- and second-moment characteristics of the original block. BTC does not provide a higher gain than modern image compression algorithms such as JPEG or JPEG 2000, but it is much less complex [5]. While BTC-based image compression offers low computational complexity, it degrades image quality compared to other compression techniques, and it suffers from two major issues, namely the blocking effect and false contours [6].
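As a sketch of the scheme just described, the following Python fragment encodes and decodes one block with the moment-preserving two-level quantizer (function and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def btc_encode_block(block):
    """Two-level moment-preserving quantizer for one image block (classic BTC)."""
    mean, std = block.mean(), block.std()
    bitmap = block >= mean          # 1-bit plane: pixels at or above the mean
    q = int(bitmap.sum())           # number of "high" pixels
    m = block.size
    if q == 0 or q == m:            # flat block: both output levels are the mean
        a = b = mean
    else:
        a = mean - std * np.sqrt(q / (m - q))    # low output level
        b = mean + std * np.sqrt((m - q) / q)    # high output level
    return bitmap, a, b

def btc_decode_block(bitmap, a, b):
    """Rebuild the block from the bitmap and the two quantizer levels."""
    return np.where(bitmap, b, a)

block = np.array([[ 10,  12, 200, 210],
                  [ 11,  13, 205, 198],
                  [  9,  14, 201, 207],
                  [ 12,  10, 199, 203]], dtype=float)
bitmap, a, b = btc_encode_block(block)
recon = btc_decode_block(bitmap, a, b)   # block mean and std are preserved
```

With these formulas for a and b, the reconstructed block has exactly the same mean and standard deviation as the original, which is the moment-preserving property the text refers to.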
3. HALFTONING METHODS
Three common methods for generating digital halftone images are:

1. Dithering: Dithering creates an output image with the same number of dots as there are pixels in the source image. It can be thought of as thresholding the source image with a dither matrix, which is laid repeatedly over the source image: wherever the pixel value of the image is greater than the value in the matrix, a dot in the output image is filled.
2. Error diffusion: Error diffusion, often called spatial dithering, is another technique for generating digital halftone images. It sequentially traverses each pixel of the source image and compares it to a threshold: if the pixel value is higher than the threshold, 255 is output; otherwise, 0 is output. The error, i.e. the difference between the input pixel value and the output value, is dispersed to nearby neighbors.
3. Dot diffusion: Dot diffusion is an attractive halftoning method which attempts to retain the good features of error diffusion while offering substantial parallelism. It has only one design parameter, called the class matrix.
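The dithering method in item 1 can be illustrated with a classic 4×4 Bayer matrix (an assumed example matrix; the paper does not prescribe a specific one):

```python
import numpy as np

# 4x4 Bayer matrix (a standard example; any dither matrix works the same way).
BAYER_4 = np.array([[ 0,  8,  2, 10],
                    [12,  4, 14,  6],
                    [ 3, 11,  1,  9],
                    [15,  7, 13,  5]])

def ordered_dither(image):
    """Halftone an 8-bit grayscale image by thresholding against a tiled matrix."""
    h, w = image.shape
    # Map matrix entries 0..15 to mid-interval thresholds in 0..255.
    thresh = (BAYER_4 + 0.5) * (256 / 16)
    tiled = np.tile(thresh, (h // 4 + 1, w // 4 + 1))[:h, :w]
    # A dot is "filled" wherever the pixel value exceeds the matrix value.
    return (image > tiled).astype(np.uint8) * 255

img = np.full((8, 8), 128, dtype=np.uint8)  # flat mid-gray patch
halftone = ordered_dither(img)              # roughly half the dots turn on
```

Because the matrix is simply tiled and compared pixel-wise, every output dot can be computed independently, which is why dithering is the cheapest of the three methods.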
4. HALFTONING BASED BLOCK TRUNCATION CODING
Halftoning based block truncation coding is an extended compression technique derived from the BTC scheme in which the BTC bitmap image is replaced with a halftone image. The main difference between BTC and HBTC lies in how the image block quantizers are determined: in contrast to the BTC scheme, which tries to maintain the mean value and standard deviation of each image block, the HBTC quantizers are simply the minimum and maximum values found in the image block. Error diffusion based BTC offers improved image quality. Ordered dithering based BTC is used when the main requirement is performance efficiency. Dot diffusion based BTC provides a balance of these two requirements: better visual quality as well as better performance efficiency [5][9][10]. All three methods provide a proper solution to the problems of traditional BTC, such as the blocking effect and false contours.
4.1. Ordered Dither Block Truncation Coding
Dithering-based BTC, namely Ordered Dither Block Truncation Coding (ODBTC), is an example of HBTC in which the bit pattern of the bitmap is generated with a dithering approach (void-and-cluster halftoning). In the encoding stage, the ODBTC scheme uses a dither array Look-Up Table (LUT) to speed up processing. The dither array substitutes for the fixed average value as the threshold for generating the bitmap image [6]. The extreme values in ODBTC are simply the minimum and maximum values found in the image block. ODBTC offers high efficiency and low computational complexity; however, the quantization error cannot be compensated with ordered dithering, so ODBTC yields lower image quality than EDBTC.
Fig. 2. Block diagram for ODBTC
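A minimal sketch of ODBTC block encoding, assuming a generic dither array in place of the void-and-cluster array used in the original scheme (names are illustrative):

```python
import numpy as np

def odbtc_encode_block(block, dither):
    """ODBTC block encoding sketch.

    `dither` has the same shape as the block, with values in [0, 1); the
    original scheme uses a void-and-cluster dither array, but any dither
    array illustrates the idea.
    """
    lo, hi = block.min(), block.max()       # the two extreme-value quantizers
    # The per-pixel threshold varies between lo and hi with the dither array,
    # replacing the fixed mean threshold of plain BTC.
    bitmap = block >= lo + (hi - lo) * dither
    return bitmap, lo, hi

def odbtc_decode_block(bitmap, lo, hi):
    return np.where(bitmap, hi, lo)

dither = (np.array([[ 0,  8,  2, 10],
                    [12,  4, 14,  6],
                    [ 3, 11,  1,  9],
                    [15,  7, 13,  5]]) + 0.5) / 16
block = np.array([[ 40,  60,  80, 100],
                  [120, 140, 160, 180],
                  [200, 210, 220, 230],
                  [ 30,  50,  70,  90]], dtype=float)
bitmap, lo, hi = odbtc_encode_block(block, dither)
recon = odbtc_decode_block(bitmap, lo, hi)  # every pixel is either lo or hi
```

Since only a comparison against a precomputed array is needed per pixel, the encoder is essentially as cheap as plain dithering, which matches the efficiency claim above.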
4.2 Error Diffusion Block Truncation Coding
Error diffusion is a method that provides better visual image quality, and error diffusion based BTC does this while also compressing the image. In this method, the inherent dithering property of error diffusion deals with the problem of false contours. Similar to the BTC scheme, EDBTC seeks a new representation (two quantizers and a bitmap image) to reduce the storage requirement. The EDBTC bitmap image is constructed by considering the quantization error, which is diffused to the nearby pixels to compensate the overall brightness; EDBTC employs an error kernel to generate the representative bitmap image. Figures 4 and 5 show the error diffusion kernels of Floyd-Steinberg, Stucki, Jarvis, and Sierra; different error kernels yield different halftone patterns. The error diffusion strategy effectively removes the annoying blocking effect and false contours while maintaining low computational complexity, although the prolonged processing time is still an issue [5][10].
Fig. 4. Error kernels: (a) Floyd-Steinberg, (b) Stucki
Fig. 5. Error kernels: (c) Jarvis, (d) Sierra
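A sketch of the Floyd-Steinberg kernel of Fig. 4(a) applied to binarize a grayscale image (this is plain error-diffusion halftoning; EDBTC would additionally substitute the block minimum and maximum for the 0/255 output levels):

```python
import numpy as np

def floyd_steinberg(image):
    """Binarize a grayscale image with Floyd-Steinberg error diffusion.

    Kernel weights relative to the current pixel X:
             X     7/16
      3/16  5/16   1/16
    """
    img = image.astype(float).copy()
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 255 if old >= 128 else 0
            out[y, x] = new
            err = old - new                  # quantization error to disperse
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out

img = np.full((16, 16), 128, dtype=np.uint8)
out = floyd_steinberg(img)   # binary output whose local mean tracks the input
```

The strictly sequential pixel traversal is what causes the prolonged processing time mentioned above: each pixel's threshold decision depends on errors diffused from earlier pixels.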
Fig. 5. Image quality comparison of different EDBTC kernels: (a) original, (b) Floyd-Steinberg, (c) Stucki
Fig. 6. Algorithm for error diffusion BTC
4.3. Dot Diffused Block Truncation Coding
Dot diffusion based BTC provides better visual quality and performance efficiency than error diffusion based BTC [4]. Here, the natural parallelism of dot diffusion is exploited to obtain better processing efficiency, and better image quality is achieved by co-optimizing the diffused matrix and the class matrix of the dot diffusion. It also provides better image quality than ordered dithering based BTC. DDBTC effectively compresses an image by decomposing it into two quantizers and a bitmap image: the quantization error of the currently processed pixel is diffused into its neighboring pixels, using the diffused matrix and class matrix concurrently, to generate the bitmap image. The high dynamic range easily removes the blocking effect and false contours.
Class matrix (4 × 4):

     8  11   6   1
    14   2   4  12
     5   9   7   3
    13  15   0  10

Diffused matrix:

    0.2716   1   0.2716
    1        X   1
    0.2716   1   0.2716
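The interplay of the class matrix (processing order) and diffused matrix (neighbor weights) can be sketched as follows; this is an illustrative reading of dot diffusion, not the authors' co-optimized implementation:

```python
import numpy as np

# Class matrix (processing order) and diffused matrix (neighbor weights).
CLASS = np.array([[ 8, 11,  6,  1],
                  [14,  2,  4, 12],
                  [ 5,  9,  7,  3],
                  [13, 15,  0, 10]])
WEIGHT = np.array([[0.2716, 1.0, 0.2716],
                   [1.0,    0.0, 1.0   ],
                   [0.2716, 1.0, 0.2716]])

def ddbtc_bitmap(block, lo, hi):
    """Dot-diffusion bitmap for one 4x4 block (illustrative sketch).

    Pixels are visited in ascending class-matrix order; the quantization
    error is spread only to neighbors not yet processed (higher class
    number), weighted by the diffused matrix and renormalized.
    """
    img = block.astype(float).copy()
    bitmap = np.zeros_like(CLASS, dtype=bool)
    mid = (lo + hi) / 2
    order = sorted((CLASS[y, x], y, x) for y in range(4) for x in range(4))
    for _, y, x in order:
        bitmap[y, x] = img[y, x] >= mid
        err = img[y, x] - (hi if bitmap[y, x] else lo)
        nbrs = [(y + dy, x + dx, WEIGHT[dy + 1, dx + 1])
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy or dx) and 0 <= y + dy < 4 and 0 <= x + dx < 4
                and CLASS[y + dy, x + dx] > CLASS[y, x]]
        total = sum(w for _, _, w in nbrs)
        for ny, nx, w in nbrs:
            img[ny, nx] += err * w / total
    return bitmap

block = np.full((4, 4), 128.0)
bm = ddbtc_bitmap(block, 0, 255)   # mixed pattern of 0s and 1s
```

Because every block uses the same fixed class matrix, all blocks can be processed in parallel, which is the source of the efficiency advantage over error diffusion.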
5. EXPERIMENTAL RESULT
5.1. Objective quality measures comparison
Objective measures such as Peak Signal to Noise Ratio (PSNR), Mean Square Error (MSE), Compression Ratio, and the Structural Similarity Index (SSIM) are used to evaluate image quality. The PSNR is the ratio between a signal's maximum power and the power of the signal's noise, and PSNR values best describe the quality of the reconstructed images. Mathematically, PSNR is defined as:

PSNR = 10 · log10 [ 255² / ( (1 / (M·N)) Σ_{m,n} [ f(m, n) − g(m, n) ]² ) ]   (1)
The mean square error is a very useful measure, as it gives the average energy lost in the lossy compression of the original image; a very small MSE means the reconstructed image is very close to the original. With f(m, n) the original image and g(m, n) the reconstructed image, the MSE is defined as:

MSE = (1 / (M·N)) Σ_{m,n} [ f(m, n) − g(m, n) ]²   (2)
The compression ratio measures the ability of data compression by comparing the size of the compressed image to the size of the original image.

Rate is another metric used to evaluate the performance of a compression algorithm: it gives the number of bits per pixel used to encode an image rather than an abstract percentage.
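The metrics in Eqs. (1) and (2) are straightforward to compute; a minimal sketch for 8-bit images (function names are illustrative):

```python
import numpy as np

def mse(f, g):
    """Mean square error between original f and reconstruction g, as in Eq. (2)."""
    return float(np.mean((f.astype(float) - g.astype(float)) ** 2))

def psnr(f, g):
    """Peak signal-to-noise ratio in dB for 8-bit images, as in Eq. (1)."""
    e = mse(f, g)
    return float('inf') if e == 0 else 10 * np.log10(255 ** 2 / e)

f = np.full((4, 4), 100, dtype=np.uint8)
g = np.full((4, 4), 101, dtype=np.uint8)   # off by one gray level everywhere
```

For the one-level error above, MSE is exactly 1 and PSNR is about 48.13 dB, illustrating how even small pixel errors map to finite PSNR values.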
Another category of image quality measures is based on the assumption that the human visual system is highly adapted to extract structural information from the viewing field [10]. Whereas the error-sensitivity approach estimates perceived errors to quantify image degradation, this approach treats degradation as a perceived change in structural information. The structural similarity index (SSIM) is calculated as a function of three components: luminance, contrast, and structure.
SSIM(x, y) = [l(x, y)]^α · [c(x, y)]^β · [s(x, y)]^γ
Fig. 7. Objective image quality comparisons

Table 1: Objective image quality comparison

Type of BTC   RMSE   PSNR    Compression Ratio   SSIM
BTC           7.52   29.03   0.8607              0.8734
ODBTC         7.13   30.13   1.03611             0.93245
EDBTC         6.17   31.52   1.73611             0.94076
DDBTC         5.27   38.03   2.3000              0.95518
The table lists the PSNR, RMSE, SSIM, and compression ratio values obtained for the different halftoning based BTC variants. From the table, as well as from visual inspection of the result images, it can be seen that dot diffused BTC provides better visual quality than the other halftoning based BTC methods.

Fig. 8. Image quality comparison of different HBTC: (a) original, (b) BTC, (c) ODBTC, (d) EDBTC, (e) DDBTC
6. CONCLUSION
In this literature survey, image compression using halftoning based block truncation coding for color images has been scrutinized; these methods can provide excellent image quality while simultaneously avoiding the inherent blocking effect and false contour artifacts of traditional BTC. Four algorithms were selected, specifically the original Block Truncation Coding (BTC), Ordered Dither Block Truncation Coding (ODBTC), Error Diffused Block Truncation Coding (EDBTC), and Dot Diffused Block Truncation Coding (DDBTC). Objective measures such as PSNR, MSE, SSIM, and compression ratio are used to evaluate image quality. This survey finds that halftoning based BTC applies not only to grayscale images but can also be extended to color images. For future study, other color spaces, such as YCbCr, Lab, and HSI, can be explored for image compression.
REFERENCES
[1] E. J. Delp and O. R. Mitchell. Image coding using block truncation coding. IEEE Transactions on Communications, 27(9):1335–1342, Sept. 1979.
[2] G. Qiu. Color image indexing using BTC. IEEE Transactions on Image Processing, 12(1), Jan. 2003.
[3] J.-M. Guo. High efficiency ordered dither block truncation coding with dither array LUT and its scalable coding application. IEEE Transactions on Image Processing, 20(1):97–110, Jan. 2010.
[4] J.-M. Guo and Y.-F. Liu. Improved block truncation coding using optimized dot diffusion. IEEE Transactions on Image Processing, 23(3):1269–1275, Mar. 2014.
[5] J.-M. Guo and Y.-F. Liu. Improved block truncation coding using modified error diffusion. Electronics Letters, 44(7):462–464, Mar. 2008.
[6] J.-M. Guo and Y.-F. Liu. Improved block truncation coding based on the void-and-cluster dithering approach. IEEE Transactions on Image Processing, 18(1):211–213, Jan. 2009.
[7] D. Salomon. Data Compression: The Complete Reference. Fourth edition.
[8] R. Ulichney. Digital Halftoning. Cambridge, MA, USA: MIT Press, 1987.
[9] M. Kamel, C. T. Sun, and G. Lian. Image compression by variable block truncation coding with optimal threshold. IEEE Transactions on Signal Processing, 39(1):208–212, Jan. 1991.
[10] J.-M. Guo and Y.-F. Liu. High capacity data hiding for error-diffused block truncation coding. IEEE Transactions on Image Processing, 21(12):4808–4818, Dec. 2012.
[11] S. Vimala, P. Uma, and B. Abidha. Improved adaptive block truncation coding for image compression. International Journal of Computer Applications, 19(7):975–988, April 2011.
[12] M. D. Lema and O. R. Mitchell. Absolute moment block truncation coding and its application to color images. IEEE Transactions on Communications, 32(10):1148–1157, Oct. 1984.
[13] V. R. Udpikar and J. P. Raina. BTC image coding using vector quantization. IEEE Transactions on Communications, 35(4):352–358, Sept. 1987.
[14] H. R. Kang. Digital Color Halftoning. New York, 1999.
[15] S. Vimala, P. Uma, and B. Abidha. Improved adaptive block truncation coding for image compression. International Journal of Computer Applications, 19(7):975–988, April 2011.
[16] V. R. Udpikar and J. P. Raina. BTC image coding using vector quantization. IEEE Transactions on Communications, 35(4):352–358, Sept. 1987.
[17] P. Fränti and T. Kaukoranta. Binary vector quantizer design using soft centroids. Signal Processing: Image Communication, 14(9):677–681, 1999.
[18] M. Kamel, C. T. Sun, and G. Lian. Image compression by variable block truncation coding with optimal threshold. IEEE Transactions on Signal Processing, 39(1):208–212, Jan. 1991.
[19] J.-M. Guo and Y.-F. Liu. High capacity data hiding for error-diffused block truncation coding. IEEE Transactions on Image Processing, 21(12):4808–4818, Dec. 2012.
[20] S. Vimala, P. Uma, and B. Abidha. Improved adaptive block truncation coding for image compression. International Journal of Computer Applications, 19(7):975–988, April 2011.
[21] C. S. Huang and Y. Lin. Hybrid block truncation coding. IEEE Signal Processing Letters, 4(12):328–330, Dec. 1997.
[22] M. D. Lema and O. R. Mitchell. Absolute moment block truncation coding and its application to color images. IEEE Transactions on Communications, 32(10):1148–1157, Oct. 1984.
[23] H. R. Kang. Digital Color Halftoning. New York, 1999.
[24] S. Forchhammer and K. S. Jensen. Data compression of scanned halftone images. IEEE Transactions on Communications, 42(4):213–238, Mar. 1997.
[25] A. K. Pal. An efficient codebook initialization approach for LBG algorithm. International Journal of Computer Science, Engineering and Applications (IJCSEA), 1(4), August 2011.
Authors
Meharban M.S. was born in Perumbavoor in 1993. She received a B.E. in Information Technology from Cochin University of Science and Technology in 2014 and is currently an M.Tech candidate (2016) in Computer Science at Model Engineering College, Thrikkakara. She also works as part-time faculty at IGNOU.

Dr. Priya S. obtained her B.Tech in Computer Science & Engineering from Kerala University, her M.Tech in Computer Science & Engineering from Pondicherry University, Pondicherry, India, and her PhD in Information & Communication Engineering from Anna University, Chennai, India. She has more than 18 years of experience as a faculty member and is a life member of ISTE.