# Compression one example


1. Agenda
   - LZW example
   - Huffman example
   - Shannon–Fano example
   - Arithmetic coding example
   - Advantages of each method
2. LZW example: encoding BILLGATES. Each new symbol is assigned the next dictionary index; when a symbol repeats (the second L), its index is emitted instead of the symbol.

   | Step | Input | Dictionary entry | Output so far | Remaining input |
   |------|-------|------------------|---------------|-----------------|
   | 1 | B | 1 = B | B | ILLGATES |
   | 2 | I | 2 = I | BI | LLGATES |
   | 3 | L | 3 = L | BIL | LGATES |
   | 4 | L | (already 3 = L) | BIL3 | GATES |
   | 5 | G | 4 = G | BIL3G | ATES |
   | 6 | A | 5 = A | BIL3GA | TES |
   | 7 | T | 6 = T | BIL3GAT | ES |
   | 8 | E | 7 = E | BIL3GATE | S |
   | 9 | S | 8 = S | BIL3GATES | |
4. Advantages and disadvantages of LZW
   Advantages:
   - It is a lossless compression algorithm, so no information is lost.
   - The code table need not be passed between the compressor and the decompressor; the decoder rebuilds it on the fly.
   - Simple, fast, and good compression.
   Disadvantages:
   - The dictionary can become too large; one approach is to throw the dictionary away when it reaches a certain size.
   - Useful only for large amounts of text data where redundancy is high.
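The trace above indexes single characters only. Standard LZW also adds multi-character strings to the dictionary as it goes; a minimal sketch of such an encoder (assuming the dictionary is seeded with the 256 one-byte characters and is allowed to grow without bound):

```python
def lzw_encode(data: str) -> list[int]:
    """Sketch of an LZW encoder over one-byte characters."""
    # Seed the dictionary with every single-character string.
    dictionary = {chr(i): i for i in range(256)}
    next_code = 256
    w = ""           # current longest match
    output = []
    for ch in data:
        wc = w + ch
        if wc in dictionary:
            w = wc   # keep extending the current match
        else:
            output.append(dictionary[w])   # emit code for the longest match
            dictionary[wc] = next_code     # add the new string to the dictionary
            next_code += 1
            w = ch
    if w:
        output.append(dictionary[w])
    return output
```

On BILLGATES this emits nine codes, since no multi-character string repeats; on repetitive input such as ABABAB it emits only four, which is where LZW's gains come from.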
5. Shannon–Fano algorithm: BILLGATES
   Symbols: B, I, L, G, A, T, E, S
   Probabilities: 1/9 for each symbol, except L, which occurs twice (2/9).
6. Symbols sorted by frequency, most frequent first: L, B, I, G, A, T, E, S.
7. For a given list of symbols, develop a corresponding list of probabilities or frequency counts, so that each symbol's relative frequency of occurrence is known. Then sort the list of symbols by frequency, with the most frequently occurring symbols at the left and the least common at the right.
8. Divide the list into two parts, with the total frequency count of the left part being as close to the total of the right part as possible.
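The sort-and-divide steps above can be sketched recursively (a sketch, assuming the input is already sorted by descending frequency; when two split points are equally balanced, the tie-break affects individual codes but the result stays prefix-free):

```python
def shannon_fano(symbols):
    """symbols: list of (symbol, frequency), pre-sorted by descending frequency.
    Returns a dict mapping each symbol to its bit string."""
    codes = {symbol: "" for symbol, _ in symbols}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(freq for _, freq in group)
        running = 0
        # Find the split point where the two halves' totals are closest.
        best_index, best_diff = 1, float("inf")
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(2 * running - total)   # |left total - right total|
            if diff < best_diff:
                best_index, best_diff = i, diff
        left, right = group[:best_index], group[best_index:]
        for symbol, _ in left:
            codes[symbol] += "0"   # left part gets 0
        for symbol, _ in right:
            codes[symbol] += "1"   # right part gets 1
        split(left)
        split(right)

    split(symbols)
    return codes
```

For the BILLGATES frequencies this assigns L the two-bit code 00 and gives every other symbol three or four bits.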
9. Successive divisions, assigning 0 to the left part and 1 to the right part at each split:
   - First split: {L, B} | {I, G, A, T, E, S}
   - Second split: {L} | {B} and {I, G} | {A, T, E, S}
   - Third split: {I} | {G} and {A, T} | {E, S}
   - Final split: {A} | {T} and {E} | {S}
13. Resulting codes:
   L: 00, B: 01, I: 100, G: 101, A: 1100, T: 1101, E: 1110, S: 1111
14. Average code length over BILLGATES, counting L's two occurrences: (2 bits × (2 + 1) + 3 bits × (1 + 1) + 4 bits × (1 + 1 + 1 + 1)) / 9 = 28/9 ≈ 3.11 bits/symbol.
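The average can be checked directly from the code table on the previous slide and the letter counts of BILLGATES, remembering that L occurs twice:

```python
# Shannon-Fano codes from the previous slide; counts are occurrences in "BILLGATES".
codes = {"L": "00", "B": "01", "I": "100", "G": "101",
         "A": "1100", "T": "1101", "E": "1110", "S": "1111"}
counts = {"L": 2, "B": 1, "I": 1, "G": 1, "A": 1, "T": 1, "E": 1, "S": 1}

total_bits = sum(len(codes[s]) * n for s, n in counts.items())
average = total_bits / sum(counts.values())
print(total_bits, round(average, 2))  # prints: 28 3.11
```

28 bits for the whole string, against 9 × 8 = 72 bits for plain one-byte characters.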
15. What are the disadvantages of Shannon–Fano coding? Which is better, Shannon–Fano or Huffman coding? Huffman coding generally has a better compression rate: its bottom-up construction is provably optimal for known symbol frequencies, while Shannon–Fano's top-down splits can yield slightly longer codes.
16. Huffman algorithm: BILLGATES
   Symbols: B, I, L, G, A, T, E, S
   Probabilities: 1/9 for each symbol, except L, which occurs twice (2/9).
17. [Figure: the symbol list is regrouped step by step as the two lowest-frequency nodes are merged.]
18. Huffman procedure:
   1. Create a leaf node for each symbol and add it to a queue ordered by frequency of occurrence.
   2. While there is more than one node in the queue:
      - Remove the two nodes of lowest probability or frequency from the queue.
      - Prepend 0 and 1 respectively to any code already assigned to these nodes.
      - Create a new internal node with these two nodes as children and with probability equal to the sum of the two nodes' probabilities.
      - Add the new node to the queue.
   3. The remaining node is the root node and the tree is complete.
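The steps above can be sketched with a binary heap as the queue (a sketch, not the slides' exact tree; the tie-breaking order changes which codes come out, but not the total length):

```python
import heapq
from itertools import count

def huffman_codes(frequencies):
    """Build Huffman codes from a {symbol: frequency} map.
    Returns {symbol: bit string}."""
    tiebreak = count()  # keeps heap entries comparable when frequencies tie
    # Each heap entry: (frequency, tiebreaker, {symbol: code-so-far})
    heap = [(freq, next(tiebreak), {sym: ""}) for sym, freq in frequencies.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, codes1 = heapq.heappop(heap)  # lowest frequency
        f2, _, codes2 = heapq.heappop(heap)  # second lowest
        # Prepend 0 to one subtree's codes and 1 to the other's.
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
    return heap[0][2]  # the remaining node is the root
```

For the BILLGATES frequencies (L: 2, all others: 1) the resulting codes total 27 bits, i.e. 3 bits per symbol.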
19. [Figure: the finished Huffman tree, with 0 and 1 labels on the branches down to the leaves B, I, G, A, T, E, L, S.]
21. Resulting codes:
   B: 00, I: 01, G: 100, A: 101, T: 1100, E: 1101, L: 1110, S: 1111
   (Note: since L is the most frequent symbol, an optimal Huffman tree would give it one of the shortest codes; the assignment above effectively treats all symbols as equally likely.)
22. Average code length over BILLGATES with these codes, counting L's two occurrences: (2 bits × (1 + 1) + 3 bits × (1 + 1) + 4 bits × (2 + 1 + 1 + 1)) / 9 = 30/9 ≈ 3.33 bits/symbol. A Huffman tree built on the true frequencies, giving L a shorter code, brings this down to 27/9 = 3 bits/symbol.
23. Advantages and disadvantages of Huffman: this compression algorithm is mainly efficient for compressing text or program files. Images, such as those commonly used in prepress, are better handled by other compression algorithms.
24. What are the advantages of Huffman coding?
   - The algorithm is easy to implement.
   - It produces a lossless compression of images.
25. Adaptive Huffman coding (dynamic Huffman coding). Advantage: the source is encoded in real time, in a single pass. Disadvantage: more vulnerable to transmission errors, since a corrupted bit can desynchronize the decoder's model from the encoder's.
26. Arithmetic coding
27. What are the advantages of arithmetic coding over Huffman coding?
   1. The compression ratio is higher than Huffman coding's.
   2. Efficiency is comparatively greater.
   3. Redundancy is much reduced.
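One reason for the better ratio is that arithmetic coding is not forced to spend a whole number of bits on each symbol; it narrows a single interval instead. A minimal sketch of that narrowing step (assuming a fixed symbol model, and leaving out the final conversion of the interval to bits):

```python
def arithmetic_encode(message, probabilities):
    """Narrow [low, high) once per symbol; any number in the final
    interval identifies the message (given its length and the model)."""
    # Assign each symbol a cumulative sub-interval [start, end) of [0, 1).
    intervals, start = {}, 0.0
    for symbol, p in probabilities.items():
        intervals[symbol] = (start, start + p)
        start += p
    low, high = 0.0, 1.0
    for symbol in message:
        span = high - low
        sym_low, sym_high = intervals[symbol]
        high = low + span * sym_high   # shrink to the symbol's sub-interval
        low = low + span * sym_low
    return low, high

# Example with a skewed model: frequent symbols shrink the interval only a little.
low, high = arithmetic_encode("AAB", {"A": 0.8, "B": 0.2})  # final interval ~[0.512, 0.64)
```

The final interval's width equals the product of the symbol probabilities (here 0.8 × 0.8 × 0.2 = 0.128), so likelier messages get wider intervals and need fewer bits to pin down.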
28. THANK YOU