
Compressive Oversampling for Robust Data Transmission in Sensor Networks - Presented at INFOCOM 2010


http://nesl.ee.ucla.edu/document/edit/324

Data loss in wireless sensing applications is inevitable, and while there have been many attempts at coping with this issue, recent developments in the area of Compressive Sensing (CS) provide a new and attractive perspective. Since many physical signals of interest are known to be sparse or compressible, employing CS not only compresses the data and reduces the effective transmission rate, but also improves the robustness of the system to channel erasures. This is possible because reconstruction algorithms for compressively sampled signals are not hampered by the stochastic nature of wireless link disturbances, which has traditionally plagued attempts at proactively handling the effects of these errors. In this paper, we propose that if CS is employed for source compression, then CS can further be exploited as an application layer erasure coding strategy for recovering missing data. We show that CS erasure encoding (CSEC) with random sampling is efficient for handling missing data in erasure channels, paralleling the performance of BCH codes, with the added benefit of graceful degradation of the reconstruction error even when the amount of missing data far exceeds the designed redundancy. Further, since CSEC is equivalent to nominal oversampling in the incoherent measurement basis, it is computationally cheaper than conventional erasure coding. We support our proposal through extensive performance studies.



  1. Recovering Lost Sensor Data through Compressed Sensing. Zainul Charbiwala. Collaborators: Younghun Kim, Sadaf Zahedi, Supriyo Chakraborty, Ting He (IBM), Chatschik Bisdikian (IBM), Mani Srivastava
  2-11. The Big Picture: a lossy communication link drops some of the transmitted sensor data. How do we recover from this loss?
    • Retransmit the lost packets
    • Proactively encode the data with some protection bits (generate error correction bits)
    • Can we do something better?
    zainul@ee.ucla.edu - CSEC - Infocom - March 2010
  12-21. The Big Picture - Using Compressed Sensing: generate compressed measurements, send them over the lossy link, and recover from the received compressed measurements (CSEC). How does this work?
    • Use knowledge of the signal model and channel
    • CS uses randomized sampling/projections
    • Random losses look like additional randomness!
    The rest of this talk focuses on describing “how” and “how well” this works.
  22. Talk Outline
    ‣ A Quick Intro to Compressed Sensing
    ‣ CS Erasure Coding for Recovering Lost Sensor Data
    ‣ Evaluating CSEC’s cost and performance
    ‣ Concluding Remarks
  23-24. Why Compressed Sensing?
    • Conventional pipeline: Physical Signal → Sampling → Compression → Communication → Application. Compression on the sensor node is computationally expensive.
    • CS pipeline: Physical Signal → Compressive Sampling → Communication → Decoding → Application. This shifts the computation to a capable server.
  25-30. Compressed Sensing - Some Intuition: a signal occupying a narrow bandwidth around some frequency. How do you acquire this signal?
    • Nyquist rate - twice the bandwidth
    • But what if you knew more about the signal?
    • CS enables signal acquisition based on information content
  31-36. Transform Domain Analysis
    ‣ We usually acquire signals in the time or spatial domain
    ‣ By looking at the signal in another domain, the signal may be represented more compactly
    ‣ E.g., a sine wave can be expressed by 3 parameters: frequency, amplitude and phase
    ‣ Or, in this case, by the index of the FFT coefficient and its complex value
    ‣ A sine wave is sparse in the frequency domain
  37-40. Acquiring a Sine Wave
    ‣ Assume we’re interested in acquiring a single sine wave x(t) in a noiseless environment
    ‣ An infinite-duration sine wave can be expressed using three parameters: frequency f, amplitude a and phase φ
    ‣ Question: What’s the best way to find the parameters?
  41-45. Acquiring a Sine Wave
    ‣ Technically, to estimate three parameters one needs three good measurements
    ‣ Questions:
      ‣ What are “good” measurements?
      ‣ How do you estimate f, a, φ from three measurements?
  46-51. Compressed Sensing
    ‣ Take three samples z1, z2, z3 of the sine wave at times t1, t2, t3
    ‣ Any solution for f, a and φ must meet the three constraints, which carve out a feasible region in the 3D parameter space:
      z_i = x(t_i) = a sin(2π f t_i + φ),  ∀i ∈ {1, 2, 3}
    ‣ The feasible solution space is much smaller than the full (f, a, φ) space
    ‣ As the number of constraints grows with more measurements, the feasible solution space shrinks
    ‣ Exhaustive search over this space reveals the right answer, given that we know a single sine is present
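The search over the (f, a, φ) space can be sketched numerically. The snippet below is an illustrative toy, not from the paper: the sample times and search grids are arbitrary choices. It recovers a sine wave's parameters from six point samples by grid search over frequency and phase, with the amplitude fitted in closed form by least squares at each candidate:

```python
import numpy as np

def recover_sine(t, z, freqs, phases):
    """Grid-search over f and phi; the amplitude a has a closed-form
    least-squares solution for each candidate (f, phi) pair."""
    best_err, best_params = np.inf, None
    for f in freqs:
        for phi in phases:
            s = np.sin(2 * np.pi * f * t + phi)
            a = (s @ z) / max(s @ s, 1e-12)   # least-squares amplitude
            err = np.linalg.norm(z - a * s)
            if err < best_err:
                best_err, best_params = err, (f, a, phi)
    return best_params

# Ground truth: f = 5 Hz, a = 2.0, phi = 0.3 rad
t = np.array([0.013, 0.071, 0.154, 0.222, 0.305, 0.411])
z = 2.0 * np.sin(2 * np.pi * 5.0 * t + 0.3)

f_hat, a_hat, phi_hat = recover_sine(
    t, z,
    freqs=np.arange(1, 11),               # candidate frequencies 1..10 Hz
    phases=np.linspace(0, np.pi, 629),    # phase step ~0.005 rad
)
```

With six constraints and only three unknowns, the feasible region collapses to (essentially) a single point, which is why the coarse grid already lands on the right answer.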
  52-57. Formulating the Problem
    ‣ We could also represent f, a and φ as a very long, but mostly empty, FFT coefficient vector:
      y = Ψx
    ‣ Here Ψ is the Fourier transform and x is the sine wave (amplitude represented by color in the slide); y has a single non-zero coefficient whose index gives f and whose complex value encodes a and φ
  58-64. Sampling Matrix
    ‣ We could also write out the sampling process in matrix form:
      z = Φx
    ‣ z holds the three measurements; Φ is k×n, with k = 3 measurements here and n the length of the Nyquist-rate signal
    ‣ Φ has three non-zero entries (one per row), placed at some “good” sample locations
  65-67. Exhaustive Search
    ‣ Objective of exhaustive search: find an estimate of the vector y that meets the constraints and is the most compact representation of x (also called the sparsest representation)
    ‣ Our search is now guided by the fact that y is a sparse vector
    ‣ Rewriting the constraints from the measurements: z = Φx and y = Ψx give z = ΦΨ⁻¹y
    ‣ The search problem:
      ŷ = argmin_ỹ ‖ỹ‖₀  s.t.  z = ΦΨ⁻¹ỹ,  where ‖y‖₀ = #{i : y_i ≠ 0}
    ‣ This optimization problem is NP-hard!
  68. l1 Minimization
    ‣ Approximate the l0 norm by the l1 norm:
      ŷ = argmin_ỹ ‖ỹ‖₁  s.t.  z = ΦΨ⁻¹ỹ,  where ‖y‖₁ = Σ_i |y_i|
    ‣ This problem can now be solved efficiently using linear programming techniques
    ‣ This approximation was not new
    ‣ The big leap in Compressed Sensing was a theorem that showed that, under the right conditions, this approximation is exact!
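The l1 program becomes a linear program after splitting ỹ into positive and negative parts. The sketch below is illustrative only: it uses a real DCT basis in place of the slides' Fourier basis so everything stays real-valued, and the sizes and sparsity are arbitrary. It recovers a sparse coefficient vector from random point samples with scipy's LP solver:

```python
import numpy as np
from scipy.fft import dct
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n, k, s = 64, 24, 2

# Psi: orthonormal DCT-II matrix, so y = Psi @ x and x = Psi.T @ y
Psi = dct(np.eye(n), axis=0, norm='ortho')

# Sparse transform-domain signal y with s non-zero coefficients
y = np.zeros(n)
y[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s) + 3.0

x = Psi.T @ y                                  # time-domain signal
Phi = np.eye(n)[np.sort(rng.choice(n, size=k, replace=False))]
z = Phi @ x                                    # k random point measurements

# Basis pursuit: min ||y~||_1  s.t.  z = Phi Psi^{-1} y~.
# Split y~ = u - v with u, v >= 0, so ||y~||_1 = 1.(u + v).
A = Phi @ Psi.T                                # Psi^{-1} = Psi.T (orthonormal)
res = linprog(c=np.ones(2 * n),
              A_eq=np.hstack([A, -A]), b_eq=z,
              bounds=(0, None))
y_hat = res.x[:n] - res.x[n:]
```

For this (well-conditioned) toy instance the LP returns the sparse vector exactly, which is the "approximation is exact" phenomenon the slide refers to.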
  69. Some CS Results
    ‣ Theorem [Rudelson06]: if k samples of a length-n signal are acquired uniformly at random (each sample equiprobable) and reconstruction is performed in the Fourier basis, recovery succeeds w.h.p. provided
      s ≤ C′ · k / log⁴(n)
    ‣ where s is the sparsity of the signal
  70-73. Handling Missing Data - Traditional Approach
    ‣ Sampling: z = Iₙx with x ∈ ℝⁿ (Iₙ the n×n identity); Compression: y = Ψz, giving compressed-domain samples
    ‣ When the communication channel is lossy, samples go missing:
    • Use retransmissions to recover lost data
    • Or, use error (erasure) correcting codes
  74-78. Handling Missing Data - Traditional Approach
    ‣ Channel coding before the channel and channel decoding after it recover the compressed-domain samples:
      w = Ωy,  w̃ = Cw,  ŷ = (CΩ)⁺w̃
    ‣ where Ω is m×n with m > n (added redundancy) and C models the erasures
    ‣ Compression is done at the application layer; channel coding is done at the physical layer, so it can’t exploit signal characteristics
  79-81. CS Erasure Coding Approach
    ‣ CS pipeline: Physical Signal → Compressive Sampling (z = Φx, Φ is k×n, k < n) → Communication (z̃ = Cz) → Decoding:
      ŷ = argmin_ỹ ‖ỹ‖₁  s.t.  z̃ = CΦΨ⁻¹ỹ
    ‣ CSEC pipeline: identical, except Φ is m×n with k < m < n, so the sender takes m measurements instead of k; the decoder is unchanged
    ‣ Over-sampling in CS is erasure coding!
  82-89. Effects of Missing Samples on CS
    ‣ z = Φx: samples missing at the receiver are the same as missing rows in the sampling matrix
    ‣ What happens if we over-sample?
    • Can we recover the lost data?
    • How much over-sampling is needed?
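Losing measurements in the channel is equivalent to deleting the corresponding rows of Φ: the receiver is left with a smaller but otherwise ordinary CS problem. A sketch of that equivalence (sizes are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 32, 12                       # m includes an oversampling margin

Phi = np.eye(n)[np.sort(rng.choice(n, size=m, replace=False))]
x = rng.standard_normal(n)
z = Phi @ x                         # m measurements leave the sender

# Channel drops 3 of the m measurements at random
kept = np.sort(rng.choice(m, size=m - 3, replace=False))
z_received = z[kept]

# Equivalent view: the receiver solves with a sampling matrix that
# simply has the lost rows removed
Phi_received = Phi[kept]
assert np.allclose(z_received, Phi_received @ x)
assert Phi_received.shape == (m - 3, n)
```

Because Φ_received is still a random row-selection matrix, the receiver's decoder needs no knowledge of which packets were lost beyond their indices.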
  90. Extending CS Results
    ‣ Claim: when m > k samples are acquired uniformly at random and communicated through a memoryless binary erasure channel that drops m − k samples, the received k samples are still equiprobable
    ‣ This implies that the bound on the sparsity condition should still hold
    ‣ If the bound is tight, the over-sampling rate (m − k) is the same as the loss rate [This paper]
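The claim can be eyeballed in simulation: start from m uniformly random sample locations, pass each through an i.i.d. erasure channel, and the survivors form a smaller but still uniformly random subset. A toy check (all parameters are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, p_loss = 1024, 400, 0.2

# m sample locations chosen uniformly at random from n
locations = rng.choice(n, size=m, replace=False)

# Memoryless binary erasure channel: each sample dropped i.i.d. w.p. p_loss
survived = locations[rng.random(m) > p_loss]

# Survivor count concentrates around m * (1 - p_loss) = 320
k = len(survived)

# Survivors stay spread uniformly over [0, n): their mean index is
# near n/2 (a crude uniformity check)
mean_index = survived.mean()
```

So designing the oversampling margin m − k to match the expected loss leaves the decoder with roughly the k equiprobable samples the sparsity bound asks for.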
  91-92. Features of CS Erasure Coding
    ‣ No need for an additional channel coding block
    ‣ Redundancy achieved by oversampling
    ‣ Recovery is resilient to incorrect channel estimates
      ‣ Traditional channel coding fails if redundancy is inadequate
    ‣ Decoding is free if CS was used for compression anyway
    ‣ Intuition:
      ‣ Channel coding spreads information out over measurements
      ‣ Compression (source coding) compacts information into few measurements
      ‣ CSEC spreads information while compacting!
  93. Signal Recovery Performance Evaluation: Create Signal → CS Sampling → Lossy Channel → CS Recovery → Reconstruction Error?
  94-97. In Memoryless Channels
    • Baseline performance - no loss
    • 20% loss - drop in recovery probability
    • 20% oversampling - complete recovery
    • Less than 20% oversampling - recovery does not fail completely
  98-103. In Bursty Channels
    • Baseline performance - no loss
    • 20% loss - drop in recovery probability
    • 20% oversampling - doesn’t recover completely (worse than baseline at low sparsity, better than baseline at high sparsity)
    • Oversampling + interleaving - still incomplete recovery
    ‣ Recovery is incomplete because of low interleaving depth
    ‣ Recovery is better at high sparsity because bursty channels deliver bigger packets on average, but with higher variance
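A common model for bursty losses like these is a two-state Gilbert-Elliott channel: a "good" state that delivers samples and a "bad" state that drops them, with Markov transitions producing loss bursts. A minimal sketch (the transition probabilities are illustrative, not the paper's):

```python
import numpy as np

def gilbert_elliott_erasures(n, p_gb, p_bg, rng):
    """Return a boolean mask, True where a sample is erased.
    p_gb: P(good -> bad), p_bg: P(bad -> good)."""
    erased = np.zeros(n, dtype=bool)
    bad = False
    for i in range(n):
        if bad:
            erased[i] = True
            bad = rng.random() >= p_bg   # stay bad unless we recover
        else:
            bad = rng.random() < p_gb
    return erased

rng = np.random.default_rng(4)
mask = gilbert_elliott_erasures(100_000, p_gb=0.05, p_bg=0.20, rng=rng)

# Stationary loss rate = p_gb / (p_gb + p_bg) = 0.2, but the losses
# arrive in bursts of mean length 1 / p_bg = 5 samples
loss_rate = mask.mean()
```

Interleaving the transmitted measurements spreads each burst across many blocks, which is why the deck pairs oversampling with interleaving in this channel.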
  104-107. In Real 802.15.4 Channel
    • Baseline performance - no loss
    • 15% loss - drop in recovery probability
    • 15% oversampling - complete recovery
    • Less than 15% oversampling - recovery does not fail completely
  108-110. Cost of CSEC
    [Bar chart: energy per block (mJ), split into random sampling, ADC, FFT, radio TX and Reed-Solomon costs, for six schemes: Sense-and-Send (m=256), Compress-and-Send with FFT (m=10), CS at 1/4 rate (m=64), Sense-and-Send + RS (k=320), Compress-and-Send + RS (k=16), and CSEC (k=80)]
    • The first three schemes carry no robustness guarantees; the last three are all equally robust (w.h.p.)
    • CSEC achieves about 2.5x lower energy than the Reed-Solomon alternatives
  111. Summary
    ‣ Oversampling is a valid erasure coding strategy for compressive reconstruction
    ‣ For binary erasure channels, an oversampling rate equal to the loss rate is sufficient
    ‣ CS erasure coding can be rate-less, like fountain codes
      ‣ Allows adaptation to varying channel conditions
    ‣ Can be computationally more efficient on the transmit side than traditional erasure codes
  112. Closing Remarks
    ‣ CSEC spreads information out while compacting
      ‣ No free lunch: the data rate requirement is higher than if using good source and channel coding independently
      ‣ But, then, the computation cost of doing so is higher too
    ‣ CSEC requires knowledge of the signal model
      ‣ If the signal is non-stationary, the model needs to be updated during recovery
      ‣ This can be done using over-sampling too
    ‣ CSEC requires knowledge of channel conditions
      ‣ Can use CS streaming with feedback
  113. Thank You
