
- 1. INFORMATION THEORY AND CODING
  Nitin Mittal
  Head of Department, Electronics and Communication Engineering
  Modern Institute of Engineering & Technology, Mohri, Kurukshetra
  BHARAT PUBLICATIONS
  135-A, Santpura Road, Yamuna Nagar - 135001
- 2. © Reserved with the publisher
  All rights reserved. No part of this publication may be reproduced, stored in a retrieval system or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the publisher.
  Dedicated to our Beloved Parents
  First Edition: Aug. 2009
  Second Edition: June 2010
  Price: Rs. 200.00
  ISBN-13: 978-81-909129-3-8
  Published by: Bharat Publications, 135-A, Santpura Road, Yamuna Nagar - 135001. Phone: 01732-227178, 232178, 94162-27140
  Laser setting by: Chanda Computers, 9896471565
  Legal Warning: The publisher has taken all possible precautions in publishing this book; in spite of these efforts, some errors might have crept in. Any mistake, error or discrepancy noted may be brought to our knowledge and shall be taken care of in the next edition. It is notified that neither the publisher nor the bookseller nor the author will be responsible for any damage or loss of action to anyone, of any kind, in any manner, arising therefrom. For missing pages or misprints the publisher takes responsibility to exchange the book within 15 days of purchase of the same edition. All costs in this connection are to be borne by the purchaser.
- 3. PREFACE
  The present volume is the outcome of my experience in teaching information theory to undergraduate classes for the last few years and of the exposure to the problems faced by students in grasping the abstract nature of the subject. This experience is the foundation and, I hope, the strength of the text. Earnest efforts have been made to present the subject matter in a well-knit manner, so as not only to stimulate the interest of the students but also to provide them with an insight into the complexities of a subject of great intrinsic beauty.
  The book is intended to serve as a text for undergraduate students, especially those opting for a course in Electronics and Communication Engineering. However, postgraduate students will find it equally useful.
  This book offers a comprehensive review of Information Theory and Coding. The main text can be divided into four sections: Probability Theory, Information Theory, Source Coding and Error Control Coding. Fairly sufficient ground has been covered in all four sections. Information theory is the study of achievable bounds for communication and is largely probabilistic and analytic in nature. Coding theory then attempts to realize the promise of these bounds through models constructed mainly by algebraic means. Different concepts have been explained with the help of examples. A large number of problems with solutions have been provided to help the reader get a firm grip on the ideas developed. There is plenty of scope for readers to try and solve problems on their own in the form of exercises.
  I am deeply indebted to all those authors whose research papers on Information Theory and Coding influenced my learning of the subject, and I take this opportunity to express my sincere gratitude to them. I am thankful to Dr. Rajesh Goel (Principal, MIET, Mohri); Mr. R. S. Chauhan (Assistant Prof., JMIT, Radaur); Mr. Vikas Mittal (Sr. Lect. in HEC, Jagadhri) and Mr. Amanjeet Panghal (Lect. in MIET, Mohri) for motivation and continuous encouragement during the preparation of this manuscript. I also wish to thank my colleagues and friends who have given many valuable suggestions on the scope and contents of the book. I am also indebted to M/s Bharat Publications, Yamuna Nagar, for bringing out the book in such a short time.
  It is my earnest belief that no work is ever complete till it has had its share of criticism, and hence I shall be glad to receive comments and suggestions for the betterment of the book.
  Author
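The preface's remark that information theory sets achievable bounds which coding theory then tries to realize can be made concrete with a small numerical sketch (not from the book; the distribution below is an arbitrary illustration): the Shannon entropy of a source is the lower bound on average codeword length, and a Huffman code shows how closely a practical code approaches it, per the noiseless coding theorem covered in Chapters 3 and 4.

```python
# Illustrative sketch: entropy H(X) is the bound, Huffman's average
# length L shows a concrete code realizing it (H <= L < H + 1).
import heapq
import math

def entropy(probs):
    """Shannon entropy in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for `probs`."""
    # Heap items: (probability, tie-breaker, symbol indices in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # merging adds one bit to every member's codeword
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.4, 0.2, 0.2, 0.1, 0.1]          # hypothetical source
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(f"H = {H:.3f} bits/symbol, Huffman average length L = {L:.2f}")
```

For this source H ≈ 2.12 bits/symbol while the Huffman code achieves L = 2.2 bits/symbol, squarely inside the bound promised by the noiseless coding theorem.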
- 4. FOREWORD
  It is a great honour and immense pleasure for me to write a foreword to a book on Information Theory and Coding by one of my esteemed colleagues, Mr. Nitin Mittal.
  Considering the needs of engineering students and the fact that they hardly get any exposure to translating technology into practical applications, a basic knowledge of Information Theory and Coding is essential and should be considered a main subject in Electronics and Communication Engineering. To cover the course material for such a vast and wide field, a comprehensive and easy-to-understand approach to the subject is required. In this book, the author has put maximum effort in this direction. The matter has been presented in a well-structured manner, in easy-to-understand language which can be grasped easily by students of different disciplines.
  Attention has also been paid to the fact that the text as well as the diagrams can be reproduced by students in theory examinations. A number of review questions given at the end of each chapter will further enhance the understanding of basic concepts.
  I am sure that this book will be quite useful to students at the undergraduate level in various institutions, as well as to postgraduate aspirants.
  With my best wishes to the author.
  Dr. RAJESH GOEL
  Principal, MIET, Mohri, Kurukshetra
- 5. CONTENTS

Chapter 1. Probability Theory and Random Variables  1–52
  1.1 Introduction  1
  1.2 Probability Theory  2
    1.2.1 Experiment  2
    1.2.2 Sample Space and Events  2
    1.2.3 Algebra of Events  3
  1.3 Probability of Events  4
    1.3.1 Properties of Probability  4
  1.4 Conditional Probability  6
    1.4.1 Conditional Probability of Independent Events  7
    1.4.2 Bayes' Formula  7
  1.5 Random Variables  13
    1.5.1 Discrete Random Variables  13
    1.5.2 Continuous Random Variables  14
  1.6 Probability Distribution of a Discrete Random Variable  14
  1.7 Cumulative Distribution Function (CDF)  15
    1.7.1 Properties of Cumulative Distribution Function  16
  1.8 Probability Density Function (PDF)  17
    1.8.1 Properties of Probability Density Function  17
  1.9 Two-Dimensional Random Variables  20
    1.9.1 Joint Distribution Function  20
    1.9.2 Marginal Distribution Function  21
    1.9.3 Independent Random Variables  21
    1.9.4 Joint Probability Density Function  21
    1.9.5 Marginal Probability Density Function  22
    1.9.6 Conditional Probability Density Function  22
  1.10 Functions of Random Variables  24
  1.11 Statistical Averages of Random Variables  26
    1.11.1 Expectation  26
    1.11.2 Moments and Variance  27
    1.11.3 Covariance and Correlation Coefficient  28
  1.12 Some Important Distributions  28
    1.12.1 The Uniform or Rectangular Distribution  28
    1.12.2 The Exponential Distribution  29
    1.12.3 Gaussian or Normal Distribution  30
    1.12.4 Rayleigh Distribution  32
  1.13 Characteristic Transformation Functions of Random Variables  34
    1.13.1 Properties of Moment Generating Function  35
    1.13.2 Determination of Statistical Averages Using MGF  36
  1.14 Convergence of a Sequence of Random Variables  37
    1.14.1 Law of Large Numbers  37
    1.14.2 Central Limit Theorem  38

Chapter 2. Random Processes  53–86
  2.1 Introduction  53
  2.2 Random Processes  54
  2.3 Statistical Averages of Random Process  55
    2.3.1 Ensemble Averages  55
    2.3.2 Time Averages  56
  2.4 Stationary Random Process  57
    2.4.1 Strictly Stationary Process  57
    2.4.2 Wide Sense Stationary Process  58
  2.5 Ergodic Process  58
    2.5.1 Properties of Ergodic Random Process  59
  2.6 Correlation Function  60
    2.6.1 Auto-Correlation Function  61
    2.6.2 Cross-Correlation Function  62
    2.6.3 Auto Covariance Function  63
    2.6.4 Cross Covariance Function  63
  2.7 Spectral Densities  64
    2.7.1 Power Spectral Density  65
    2.7.2 Cross Power Spectral Density  67
    2.7.3 Energy Spectral Density  67
  2.8 Response of Linear Systems to the Input Random Processes  69
  2.9 Special Classes of Random Processes  73
    2.9.1 Gaussian Process  73
    2.9.2 Markov Process  74
    2.9.3 Poisson Process  75
    2.9.4 White Noise or White Process  76
    2.9.5 Band-Limited White Noise or Process  77

Chapter 3. Elements of Information Theory  87–134
  3.1 Introduction  87
  3.2 Information Sources  88
  3.3 Information: A Measure of Uncertainty  88
  3.4 Average Information or Entropy  89
    3.4.1 Properties of Entropy  91
  3.5 Binary Sources  94
  3.6 Extension of a Discrete Memoryless Source  95
  3.7 Measure of Information for Two-Dimensional Discrete Finite Probability Scheme  96
    3.7.1 Discrete Memoryless Channels  98
  3.8 Noise Characteristics of a Channel  101
  3.9 Measure of Mutual Information  102
    3.9.1 Relationship Among Various Entropies  103
    3.9.2 Mutual Information  103
    3.9.3 Properties of Mutual Information  104
  3.10 Channel Capacity  107
  3.11 Channel Capacity of Binary Noise Structures  107
    3.11.1 Channel Capacity of a BSC (Binary Symmetric Channel)  108
    3.11.2 Channel Capacity of a BEC (Binary Erasure Channel)  109
  3.12 Differential Entropy and Mutual Information for Continuous Signals  110
  3.13 Shannon's Theorem on Coding for Memoryless Noisy Channel  113

Chapter 4. Source Encoding  135–184
  4.1 Introduction  135
  4.2 Source Encoding  136
  4.3 Basic Properties of Codes  137
  4.4 Separable Binary Codes  139
  4.5 Shannon–Fano Encoding  141
  4.6 Noiseless Coding Theorem  144
  4.7 Theorem of Decodability  149
  4.8 Average Length of Encoded Messages  150
  4.9 Shannon's Binary Encoding  152
  4.10 Fundamental Theorem of Discrete Noiseless Coding  154
  4.11 Huffman's Minimum-Redundancy Code  156

Chapter 5. Error Control Coding for Digital Communication System  185–206
  5.1 Introduction  185
  5.2 Elements of Digital Communication System  186
  5.3 Mathematical Models for Communication Channels  192
  5.4 Channel Codes  194
  5.5 Modulation and Coding  196
  5.6 Maximum Likelihood Decoding  200
  5.7 Types of Errors  202
  5.8 Error Control Strategies  203

Chapter 6. Error Detection and Correction  207–224
  6.1 Introduction  207
  6.2 Types of Errors  208
  6.3 Error Detection  209
    6.3.1 Parity Check  210
    6.3.2 Cyclic Redundancy Check (CRC)  211
    6.3.3 Checksum  213
  6.4 Error Correction  215
    6.4.1 Single-Bit Error Correction  215
    6.4.2 Burst Error Correction  219

Chapter 7. Field Algebra  225–254
  7.1 Introduction  225
  7.2 Binary Operations  225
  7.3 Groups  227
    7.3.1 Commutative Group  227
    7.3.2 Semi-Group  228
    7.3.3 Order of a Group  228
    7.3.4 Addition Modulo m  228
    7.3.5 Multiplication Modulo m  228
    7.3.6 General Properties of Groups  230
  7.4 Fields  230
    7.4.1 Characteristics of the Field  234
  7.5 Binary Field Arithmetic  234
    7.5.1 Irreducible Polynomial over GF(2)  236
    7.5.2 Primitive Polynomial over GF(2)  237
  7.6 Construction of Galois Field GF(2^m)  239
  7.7 Basic Properties of Galois Field GF(2^m)  243
  7.8 Vector Spaces  246
  7.9 Matrices  249

Chapter 8. Linear Block Codes  255–282
  8.1 Introduction  255
  8.2 Repetition Code  256
    8.2.1 Majority Voting Decoder  256
    8.2.2 Single Error Correcting Repetition Code  256
    8.2.3 Advantages and Disadvantages of Repetition Codes  257
  8.3 Binary Block Codes  257
  8.4 Linear Block Code  258
    8.4.1 Systematic Linear Block Code  259
    8.4.2 Encoder for Linear Block Code  262
    8.4.3 Parity-Check Matrix  263
  8.5 Syndrome Calculation for Linear Block Code  264
    8.5.1 Properties of the Syndrome (S)  268
  8.6 The Hamming Distance of a Block Code  270
  8.7 Error-Detecting and Correcting Capabilities  271
  8.8 Syndrome Decoding of Linear Block Code  273
  8.9 Burst Error Correcting Block Codes  275
  8.10 Other Important Block Codes  277
    8.10.1 Hamming Codes  277
    8.10.2 Extended Codes  278

Chapter 9. Cyclic Codes  283–308
  9.1 Introduction  283
  9.2 Cyclic Codes  284
  9.3 Generator Polynomial of Cyclic Codes  285
  9.4 Parity-Check Polynomial of Cyclic Codes  286
  9.5 Systematic Cyclic Codes  288
  9.6 Generator and Parity-Check Matrices of Cyclic Codes  290
  9.7 Encoder for Cyclic Codes  292
  9.8 Syndrome Polynomial Computation  295
  9.9 Decoding of Cyclic Codes  297
  9.10 Error-Trapping Decoding  299
  9.11 Advantages and Disadvantages of Cyclic Codes  301
  9.12 Important Classes of Cyclic Codes  301

Chapter 10. BCH Codes  309–338
  10.1 Introduction  309
  10.2 Binary BCH Codes  310
  10.3 Generator Polynomial of Binary BCH Codes  310
  10.4 Parity-Check Matrix of BCH Code  314
  10.5 Encoding of BCH Codes  316
  10.6 Properties of BCH Codes  318
  10.7 Decoding of BCH Codes  318
    10.7.1 Syndrome Computation  318
    10.7.2 Error Location Polynomial  320
  10.8 BCH Decoder Architecture  321
    10.8.1 Peterson's Direct Algorithm  322
    10.8.2 Berlekamp's Iterative Algorithm  326
    10.8.3 Chien Search Algorithm  332
  10.9 Non-Primitive BCH Code  333
  10.10 Non-Binary BCH Codes and RS Codes  334

Chapter 11. Convolutional Codes  339–376
  11.1 Introduction  339
  11.2 Convolutional Codes  340
  11.3 Convolutional Encoder  341
    11.3.1 Encoding Using Discrete Convolution  342
    11.3.2 Encoding Using Generator Matrix  344
  11.4 Structural Properties of Convolutional Codes  346
    11.4.1 Code-Tree Diagram  346
    11.4.2 Trellis Diagram  348
    11.4.3 State Diagram Representation  349
  11.5 Decoding of Convolutional Code  350
    11.5.1 Maximum-Likelihood Decoding  350
    11.5.2 The Viterbi Decoding Algorithm  352
    11.5.3 Sequential Decoding of Convolutional Codes  356
  11.6 Distance Properties of Convolutional Codes  357
  11.7 Burst Error Correcting Convolutional Codes  359

Chapter 12. Basic ARQ Strategies  377–388
  12.1 Introduction  377
  12.2 Automatic Repeat Request (ARQ)  378
  12.3 Stop-and-Wait ARQ  379
  12.4 Continuous ARQ  381
    12.4.1 Go-Back-N ARQ  381
    12.4.2 Selective Repeat ARQ  383
  12.5 Hybrid ARQ  384

Chapter 13. Cryptography  389–404
  13.1 Introduction  389
  13.2 Cryptography  390
    13.2.1 Need of Cryptography  390
    13.2.2 Cryptographic Goals  390
    13.2.3 Evaluation of Information Security  391
  13.3 Cryptography Components  392
  13.4 Symmetric Key Cryptography  393
    13.4.1 Symmetric Key Encryption/Decryption  394
    13.4.2 Techniques for Coding Plain Text to Cipher Text  394
    13.4.3 Advantages and Disadvantages of Symmetric Key Cryptography  396
  13.5 Asymmetric Key Cryptography  396
    13.5.1 Public Key Encryption/Decryption  397
    13.5.2 Conversion of Plain Text to Cipher Text Algorithms  398
    13.5.3 Advantages and Disadvantages of Public-Key Cryptography  400
  13.6 Comparison Between Symmetric and Public-Key Cryptography  401
  13.7 Cryptography Applications  401

Sample Model Papers  405–407
References  408
Index  409–412
- 11. REFERENCES
1. A. Bruce Carlson, Janet C. Rutledge and Crilly, "Communication Systems", 3rd ed., McGraw-Hill, 2002.
2. A. Papoulis, "Probability, Random Variables and Stochastic Processes", McGraw-Hill, 1991.
3. Andrew S. Tanenbaum, "Computer Networks", 3rd ed., Prentice Hall, Upper Saddle River, NJ, 1996.
4. B. P. Lathi, "Modern Digital and Analog Communication Systems", 3rd ed., Oxford University Press, 2007.
5. B. R. Bhat, "Modern Probability Theory", New Age International Ltd., 1998.
6. Behrouz A. Forouzan, "Data Communication and Networking", Tata McGraw-Hill, 2003.
7. D. Stinson, "Cryptography: Theory and Practice", 2nd ed., CRC Press, 2000.
8. Das, Mullick and Chatterjee, "Digital Communication", Wiley Eastern Ltd., 1998.
9. Fazlollah M. Reza, "An Introduction to Information Theory", Dover Publications, New York, 1994.
10. Gregory Karpilovsky, "Field Theory: Classical Foundations and Multiplicative Groups", Tata McGraw-Hill, 1988.
11. Herbert Taub and Donald L. Schilling, "Principles of Communication Systems", 3rd ed., Tata McGraw-Hill, 2008.
12. J. E. Pearson, "Basic Communication Theory", Prentice Hall, Upper Saddle River, NJ, 1992.
13. John G. Proakis and Masoud Salehi, "Fundamentals of Communication Systems", Pearson Education, 2006.
14. R. E. Blahut, "Principles and Practice of Information Theory", Addison-Wesley, Reading, Mass., 1987.
15. R. P. Singh and S. D. Sapre, "Communication Systems – Analog and Digital", 2nd ed., Tata McGraw-Hill, 2007.
16. R. M. Gray and L. D. Davisson, "Introduction to Statistical Signal Processing", Web Edition, 1999.
17. Robert G. Gallager, "Information Theory and Reliable Communication", John Wiley & Sons, 1968.
18. Shu Lin and D. J. Costello, "Error Control Coding: Fundamentals and Applications", Prentice Hall, 1983.
19. Simon Haykin, "Communication Systems", 4th ed., John Wiley & Sons, New York, 2001.
20. W. Stallings, "Cryptography and Network Security: Principles and Practice", 2nd ed., Prentice Hall, 1999.
21. William Stallings, "Data and Computer Communications", 5th ed., Prentice Hall, Upper Saddle River, NJ, 1997.
- 12. INDEX

A
Acknowledgement
  – negative, 379, 382
  – positive, 379
Addition
  – modulo-2, 247, 265
  – modulo-m, 228
  – vector, 247
Additive White Gaussian Noise, 192, 197
Analog-to-Digital (A/D) Converter, 185
Arithmetic
  – binary field, 234
  – Galois field, 252
ARQ
  – continuous, 204, 378, 381
  – Go-back-N, 204, 381
  – hybrid, 204, 384
  – selective-repeat, 204, 381
  – stop-and-wait, 204, 378
  – Type-I hybrid, 204, 386
  – Type-II hybrid, 204, 387
Auto Correlation, 60

B
Bandwidth, 115, 192
BCH Codes, 303, 309
  – binary, 310
  – decoding, 318
  – encoding, 316
  – generator polynomial, 310
  – non-binary, 334
  – non-primitive, 333
  – parity-check matrix, 315
  – primitive, 311
  – properties, 318
  – syndrome computation, 318
BCH Decoder, 321
Berlekamp Iterative Algorithm, 326
Binary Erasure Channel (BEC), 87
Binary Operation, 225
Binary Symmetric Channel (BSC), 87, 202
Block Codes
  – binary, 257
  – burst-error-correcting, 275
  – Hamming, 277
  – interleaved, 276
  – linear, 258
  – random-error-correcting, 275
  – systematic, 259
Burst Error, 202, 211
Burst Length, 208, 360

C
Central Limit Theorem, 38
Channel
  – additive white Gaussian noise, 115, 192
  – binary erasure, 109
  – binary symmetric, 108
  – burst-error, 203
  – deterministic, 101
  – discrete memoryless, 98, 198
  – lossless, 101
  – noiseless, 102
Channel Capacity, 107
Channel Encoder, 189
Characteristic of Field, 234
Checksum, 213
Chien Search, 330
Cipher-text, 392
Code Efficiency, 137
Code Length, 136
Code Vector, 194
Codeword, 137, 188
Complementary Error Function, 199
Constraint Length, 340
Convolutional Codes, 195, 339
  – burst-error-correcting, 360
  – constraint length, 340
  – distance properties, 358
  – generator matrix, 345
  – structural properties, 346
  – transfer function, 358
Correlation Functions
  – auto correlation, 60
  – cross correlation, 62
Coset, 274
Coset Leader, 274
Cryptography, 389
  – applications, 401
  – asymmetric key, 396
  – symmetric key, 393
Cyclic Codes
  – decoding, 297
  – encoder, 292
  – generator matrix, 290
  – generator polynomial, 285
  – Meggitt decoder, 297, 299
  – parity-check matrix, 291
  – parity-check polynomial, 286
  – syndrome, 295
  – systematic, 288
Cyclic Shifts, 284

D
Decoders
  – channel, 191
  – maximum-likelihood, 201
  – Meggitt, 297, 299
  – syndrome, 295
Decoding
  – BCH codes, 318
  – cyclic codes, 297
  – error-trapping, 299
  – maximum likelihood, 200
  – syndrome, 273
  – Viterbi, 196, 352
Decryption, 393
Demodulator, 197
Deterministic Signals, 1
Digital Communication System, 186
  – channel decoder, 191
  – channel encoder, 189
  – decryption, 191
  – demodulator, 191
  – destination, 186
  – encryption, 188
  – information source, 186
  – modulator, 190
  – source decoder, 191
  – source encoder, 188
  – synchronisation, 191
Digital Signature, 397, 399
Digital-to-Analog (D/A) Conversion, 187, 191
Discrete Memoryless Channel (DMC), 98, 198
  – channel matrix, 99
  – channel representation, 98
  – channel transition probability, 99
Distance
  – Hamming, 270
  – minimum, 270
Distance Properties of Convolutional Codes, 357
Distribution Function
  – cumulative, 15
  – joint, 20
  – marginal, 21
Distributions
  – exponential, 29
  – Gaussian, 30
  – Rayleigh, 32
  – uniform, 28
DSA Algorithm, 399

E
Encoders
  – channel, 189
  – convolutional code, 196, 339
  – cyclic code, 292
  – linear block code, 262
  – source, 136
Encoding, 190
Encryption, 392
Entropy, 89
Ergodicity, 58
Error Control Strategies, 203
  – automatic repeat request (ARQ), 204, 377
  – for digital storage system, 204
  – forward error control, 203, 377
Error Correction Capability, 271
Error Detection Capability, 271
Error Location Numbers, 321
Error Location Polynomial, 318, 320
Error Patterns
  – correctable, 379, 385
  – detectable, 379
  – uncorrectable, 385
  – undetectable, 265, 379
Errors
  – burst, 202, 277
  – random, 202, 277
  – transmission, 265
  – undetected, 265, 378
Event, 3, 94
Expectation, 26
Experiment, 2, 94
  – head, 2
  – tail, 2
  – trial, 2

F
Field
  – binary, 233
  – finite, 233
  – Galois, 233, 239
  – prime, 233
Finite Field, 233
Forward Error Correction, 203
Fundamental Theorem of Discrete Noiseless Coding, 154

G
Galois Field, 233, 239
Galois Field Arithmetic, 252
Gaussian Process, 73
Generator Matrix
  – block codes, 260
  – convolutional codes, 345
  – cyclic codes, 291
Generator Polynomial
  – BCH codes, 310
  – cyclic codes, 285
  – Golay codes, 302
GF(2), 233
GF(2^m), 234
GF(P), 233
GF(q), 234
Golay Codes, 302
Group
  – commutative, 227

H
Hamming Bound, 273
Hamming Codes, 216, 277
Hamming Distance, 270
Hamming Weight, 270
Huffman's Minimum Redundancy Codes, 156
Hybrid ARQ Schemes, 204, 384, 385

I
Identity Element, 227
Identity Matrix, 263
Information Source, 186
  – source alphabet, 188
  – symbol, 188
Information Theory
  – average information, 89
  – DMS, 88, 95
  – information, 88
  – information rate, 92
  – information sources, 88
  – source alphabet, 88
Interleaving, 277
Inverse
  – additive, 232
  – multiplicative, 231
Irreducible Polynomial, 236
Iterative Algorithm, 326, 328

L
Law of Large Numbers, 37
Linear Block Codes
  – block code, 257
  – generator matrix, 259
  – linear code, 258
  – parity-check matrix, 263
  – systematic codes, 259
Linearity Property, 284
Linearly Dependent Vectors, 248
Linearly Independent Vectors, 248
Location Numbers, 321

M
Markov Process, 74
Matrix
  – channel, 99
  – generator, 259
  – identity, 263
  – parity-check, 263
  – transpose of, 249
Maximum Length Codes, 302
Maximum Likelihood Decoding, 200, 350
Mean, 26
Meggitt Decoder, 297, 299
Minimal Polynomial, 311
Minimum Distance, 270
Modulator, 190
Modulo-2 Addition, 265
Modulo-m Addition, 228
Modulo-m Multiplication, 228
Moments, 27
Multiplication
  – modulo-m, 228
  – scalar, 246
Mutual Information, 102

N
(n, k) Block Code, 196, 257
(n, k, K) Convolutional Code, 196, 341
n-tuple, 195
Negative Acknowledgement, 379, 382
Newton's Identities, 323
Noiseless Coding Theorem, 144
Non-binary BCH Codes, 334
Non-primitive BCH Codes, 333
Null Space, 249

O
Optimal Codes, 139
Order
  – of a field, 231
  – of a group, 228

P
Parity-Check Matrix
  – BCH codes, 315
  – block codes, 263
  – cyclic codes, 291
Plain Text, 392
Poisson Process, 75
Polynomials
  – irreducible, 236
  – minimal, 311
  – primitive, 312
Positive Acknowledgement, 379
Power Spectral Density, 65
Prime Numbers, 233
Primitive BCH Codes, 311
Primitive Elements, 311
Primitive Polynomials, 237, 312
Probability
  – conditional, 6
  – properties of, 4
  – transition, 98, 199
Probability Density Function, 17
  – conditional, 22
  – joint, 21
  – marginal, 22

R
Random Error Correcting Codes, 277
Random Errors, 277
Random Process, 54
Random Signals, 2
Random Variables, 13
  – continuous, 14
  – discrete, 13
Rayleigh Distribution, 32
Registers
  – buffer, 383
  – message, 262
  – shift, 341
  – syndrome, 298
Repetition Code, 256
Representation of Galois Fields, 241
Response of Linear Systems, 69
Retransmission, 378
Round-trip Delay, 382
RSA Algorithm, 398

S
Sample Space, 2
Selective-Repeat ARQ, 204, 381, 383
Separable Binary Codes, 139
Sequence, 351
Shannon's Binary Code, 152
Shannon–Fano Coding, 141
Single Error Correcting Codes, 310
Source
  – decoder, 191
  – digital, 187
  – encoder, 188
  – information, 186
Source Encoding
  – code efficiency, 137
  – code length, 136
  – code redundancy, 137
  – distinct codes, 137
  – instantaneous codes, 138
  – Kraft–McMillan inequality, 140
  – optimal codes, 139
  – prefix codes, 139
  – uniquely decodable codes, 138
  – variable length codes, 137
Space, 246
Span, 248
Spectral Densities
  – energy, 67
  – power, 65
State Diagram, 349
Stationary Process
  – strict sense, 57
  – wide sense, 58
Substitution Techniques, 394
  – monoalphabetic, 394
  – polyalphabetic, 395
Syndrome
  – BCH codes, 318
  – block codes, 264
  – cyclic codes, 295
  – decoding, 273
Syndrome Register, 298
Systematic Codes, 259

T
Theorem of Decodability, 149
Throughput Efficiency
  – go-back-N ARQ, 204, 383
  – selective-repeat ARQ, 204, 384
  – stop-and-wait ARQ, 204, 380
Transfer Function of Convolutional Codes, 358
Transition Probabilities, 99, 199
Transmission Errors, 265
Transpose of a Matrix, 249
Transposition Techniques, 395
Trellis Diagram, 348

V
Vector Addition, 247
Vector Space, 246
Vectors, 246
Viterbi Algorithm, 352