
Eighth Asian-European Workshop on Information Theory: Fundamental Concepts in Information Theory


Kamakura, Kanagawa, JAPAN
May 17-19, 2013
Editors: Hiroyoshi Morita
A. J. Han Vinck
Te Sun Han
Akiko Manada


  1. Network Coding and PolyMatroid/Co-PolyMatroid: A Short Survey
     Joe Suzuki, Osaka University
     Eighth Asian-European Workshop on Information Theory, Kamakura, Kanagawa, May 17-19, 2013
  2. Road Map
     - From multiterminal information theory to network coding
     - Why polymatroid/co-polymatroid?
     - Comparing three papers
     - Future problems
     1. T. S. Han, "Slepian-Wolf-Cover theorem for a network of channels", Inform. Control, vol. 47, no. 1, pp. 67-83, 1980.
     2. R. Ahlswede, N. Cai, S.-Y. R. Li, and R. W. Yeung, "Network information flow", IEEE Trans. Inf. Theory, vol. IT-46, pp. 1204-1216, 2000.
     3. T. S. Han, "Multicasting multiple correlated sources to multiple sinks over a noisy channel network", IEEE Trans. Inf. Theory, Jan. 2011.
  3. Network $N = (V, E, C)$
     - $G = (V, E)$: a DAG; $V$: finite set (nodes); $E \subset \{(i,j) \mid i \ne j,\ i, j \in V\}$ (edges)
     - $\Phi, \Psi \subset V$, $\Phi \cap \Psi = \emptyset$ (source and sink nodes)
     - Sources $X_s^n = (X_s^{(1)}, \dots, X_s^{(n)})$ ($s \in \Phi$): stationary ergodic; $X_\Phi = (X_s)_{s \in \Phi}$, $X_T = (X_s)_{s \in T}$ ($T \subset \Phi$)
     - Channels $C = (c_{ij})$, with capacity $c_{ij} := \lim_{n \to \infty} \frac{1}{n} \max_{X_i^n} I(X_i^n; X_j^n)$; the channels are statistically independent for each $(i,j) \in E$ and satisfy the strong converse property.
  4. Existing Results Assuming DAGs

                           Sinks: single    Sinks: multiple
     Sources: single        -                Ahlswede et al. 2000
     Sources: multiple      Han 1980         Han 2011
  5. Capacity Function $\rho_N(S)$, $S \subset \Phi$
     - $(M, \bar{M})$: a cut, where $M \subset V$ and $\bar{M} := V \setminus M$; cut set $E_M := \{(i,j) \in E \mid i \in M,\ j \in \bar{M}\}$
     - $c(M, \bar{M}) := \sum_{(i,j) \in E,\ i \in M,\ j \in \bar{M}} c_{ij}$
     - $\rho_t(S) := \min_{M:\, S \subset M,\ t \in \bar{M}} c(M, \bar{M})$ for each $\emptyset \ne S \subset \Phi$ and $t \in \Psi$
     - $\rho_N(S) := \min_{t \in \Psi} \rho_t(S)$
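These cut definitions translate directly into a brute-force computation. The sketch below is a minimal illustration (hypothetical helper names, not from the slides): it enumerates every $M$ with $S \subset M$ and $t \in \bar{M}$, which is only practical for toy networks such as the examples that follow.

```python
from itertools import combinations

def cut_capacity(c, V, M):
    """c(M, M-bar): total capacity of edges (i, j) with i in M and j outside M."""
    return sum(cap for (i, j), cap in c.items() if i in M and j not in M)

def rho_t(c, V, S, t):
    """rho_t(S): minimum cut capacity over all M with S ⊆ M and t ∉ M (brute force)."""
    rest = list(V - set(S) - {t})
    best = float("inf")
    for r in range(len(rest) + 1):
        for extra in combinations(rest, r):
            best = min(best, cut_capacity(c, V, set(S) | set(extra)))
    return best

def rho_N(c, V, S, sinks):
    """rho_N(S): minimum of rho_t(S) over all sink nodes t."""
    return min(rho_t(c, V, S, t) for t in sinks)
```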
  6. Example 1
     $\Phi = \{s_1, s_2\}$, $\Psi = \{t_1, t_2\}$, $c_{ij} = 1$ for $(i,j) \in E$
     [Figure: butterfly-type network with sources $s_1, s_2$, sinks $t_1, t_2$, and unit-capacity edges]
     $\rho_{t_1}(\{s_2\}) = \rho_{t_2}(\{s_1\}) = 1$, $\quad \rho_{t_1}(\{s_1\}) = \rho_{t_2}(\{s_2\}) = 2$
     $\rho_{t_1}(\{s_1, s_2\}) = \rho_{t_2}(\{s_1, s_2\}) = 2$
     $\rho_N(\{s_1\}) = \min(\rho_{t_1}(\{s_1\}), \rho_{t_2}(\{s_1\})) = 1$
     $\rho_N(\{s_2\}) = \min(\rho_{t_1}(\{s_2\}), \rho_{t_2}(\{s_2\})) = 1$
     $\rho_N(\{s_1, s_2\}) = \min(\rho_{t_1}(\{s_1, s_2\}), \rho_{t_2}(\{s_1, s_2\})) = 2$
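As a usage check, the figure on this slide appears to be the standard butterfly network. Assuming that topology (with the two unnamed relay nodes called u and w here, an assumption not stated on the slide), the helpers from the previous sketch reproduce the values above.

```python
# Hypothetical reconstruction of the Example 1 topology (standard butterfly network).
V = {"s1", "s2", "u", "w", "t1", "t2"}
c = {("s1", "t1"): 1, ("s1", "u"): 1, ("s2", "u"): 1, ("s2", "t2"): 1,
     ("u", "w"): 1, ("w", "t1"): 1, ("w", "t2"): 1}
sinks = {"t1", "t2"}

print(rho_N(c, V, {"s1"}, sinks))        # 1
print(rho_N(c, V, {"s2"}, sinks))        # 1
print(rho_N(c, V, {"s1", "s2"}, sinks))  # 2
```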
  7. Example 2
     $\Phi = \{s_1, s_2\}$, $\Psi = \{t_1, t_2\}$, $0 < p < 1$; the capacity $c_{ij}$ of each edge marked $\to$ in the figure is replaced by $h(p) := -p \log_2 p - (1-p) \log_2 (1-p)$
     [Figure: the same network as Example 1, with the marked edges having capacity $h(p)$]
     $\rho_{t_1}(\{s_2\}) = \rho_{t_2}(\{s_1\}) = h(p)$, $\quad \rho_{t_1}(\{s_1\}) = \rho_{t_2}(\{s_2\}) = 1 + h(p)$
     $\rho_{t_1}(\{s_1, s_2\}) = \rho_{t_2}(\{s_1, s_2\}) = \min\{1 + 2h(p), 2\}$
     $\rho_N(\{s_1\}) = \min(\rho_{t_1}(\{s_1\}), \rho_{t_2}(\{s_1\})) = h(p)$
     $\rho_N(\{s_2\}) = \min(\rho_{t_1}(\{s_2\}), \rho_{t_2}(\{s_2\})) = h(p)$
     $\rho_N(\{s_1, s_2\}) = \min(\rho_{t_1}(\{s_1, s_2\}), \rho_{t_2}(\{s_1, s_2\})) = \min\{1 + 2h(p), 2\}$
  8. $(n, (R_{ij})_{(i,j) \in E}, \delta, \epsilon)$-code
     - $\mathcal{X}_s$: the set of values $X_s$ can take
     - $f_{sj}: \mathcal{X}_s^n \to [1, 2^{n(R_{sj} - \delta)}]$ for each $s \in \Phi$, $(s,j) \in E$;
       $h_{sj} = \psi_{sj} \circ w_{sj} \circ \varphi_{sj} \circ f_{sj}: \mathcal{X}_s^n \to [1, 2^{n(R_{sj} - \delta)}]$
     - $f_{ij}: \prod_{k:(k,i) \in E} [1, 2^{n(R_{ki} - \delta)}] \to [1, 2^{n(R_{ij} - \delta)}]$ for each $i \notin \Phi$, $(i,j) \in E$;
       $h_{ij} = \psi_{ij} \circ w_{ij} \circ \varphi_{ij} \circ f_{ij}: \prod_{k:(k,i) \in E} [1, 2^{n(R_{ki} - \delta)}] \to [1, 2^{n(R_{ij} - \delta)}]$
     - $g_t: \prod_{k:(k,t) \in E} [1, 2^{n(R_{kt} - \delta)}] \to \mathcal{X}_\Phi^n$ for each $t \in \Psi$, with output $\hat{X}_{\Phi,t}$
     - $\lambda_{n,t} := \Pr\{\hat{X}_{\Phi,t} \ne X_\Phi^n\} \le \epsilon$
  9. Han 1980 ($|\Psi| = 1$)
     Def: $(R_{ij})_{(i,j) \in E}$ is achievable for $X_\Phi$ and $G = (V, E)$ iff an $(n, (R_{ij})_{(i,j) \in E}, \delta, \epsilon)$-code exists.
     Def: $X_\Phi$ is transmissible over $N = (V, E, C)$ iff $(R_{ij} + \tau)_{(i,j) \in E}$ is achievable for $G = (V, E)$ and any $\tau > 0$.
     Theorem ($|\Psi| = 1$, $\Psi = \{t\}$): $X_\Phi$ is transmissible over $N$ $\iff$ $H(X_S \mid X_{\bar{S}}) \le \rho_t(S)$ for each $\emptyset \ne S \subset \Phi$.
     The notion of network coding first appeared here.
  10. Polymatroid/Co-Polymatroid
      $E$: a nonempty finite set
      Def: $\rho: 2^E \to \mathbb{R}_{\ge 0}$ is a polymatroid on $E$ iff
        1. $0 \le \rho(X) \le |X|$
        2. $X \subset Y \subset E \Rightarrow \rho(X) \le \rho(Y)$
        3. $\rho(X) + \rho(Y) \ge \rho(X \cup Y) + \rho(X \cap Y)$
      Def: $\sigma: 2^E \to \mathbb{R}_{\ge 0}$ is a co-polymatroid on $E$ iff
        1. $0 \le \sigma(X) \le |X|$
        2. $X \subset Y \subset E \Rightarrow \sigma(X) \le \sigma(Y)$
        3. $\sigma(X) + \sigma(Y) \le \sigma(X \cup Y) + \sigma(X \cap Y)$
      $H(X_S \mid X_{\bar{S}})$ is a co-polymatroid on $\Phi$; $\rho_t(S) = \min_{M:\, S \subset M,\ t \in \bar{M}} c(M, \bar{M})$ is a polymatroid on $\Phi$.
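The third axiom in each definition is submodularity (polymatroid) or supermodularity (co-polymatroid). A minimal sketch for checking it on a small ground set by enumerating all pairs of subsets (hypothetical helper names):

```python
from itertools import combinations

def all_subsets(E):
    """All subsets of the ground set E, as frozensets."""
    E = list(E)
    return [frozenset(c) for r in range(len(E) + 1) for c in combinations(E, r)]

def is_submodular(f, E):
    """Polymatroid axiom 3: f(X) + f(Y) >= f(X ∪ Y) + f(X ∩ Y) for all X, Y ⊆ E."""
    subs = all_subsets(E)
    return all(f(X) + f(Y) >= f(X | Y) + f(X & Y) - 1e-9 for X in subs for Y in subs)

def is_supermodular(f, E):
    """Co-polymatroid axiom 3: f(X) + f(Y) <= f(X ∪ Y) + f(X ∩ Y) for all X, Y ⊆ E."""
    subs = all_subsets(E)
    return all(f(X) + f(Y) <= f(X | Y) + f(X & Y) + 1e-9 for X in subs for Y in subs)
```

For instance, with the Example 1 reconstruction above, `is_submodular(lambda S: rho_t(c, V, S, "t1"), {"s1", "s2"})` checks axiom 3 for $\rho_{t_1}$ on the two-source ground set.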
  11. Co-polymatroid $\sigma(S)$ and polymatroid $\rho(S)$
      Slepian-Wolf coding is available for the proof of the direct part, because
      $\{(R_s)_{s \in \Phi} \mid \sigma(S) \le \sum_{i \in S} R_i \le \rho(S),\ \emptyset \ne S \subset \Phi\} \ne \emptyset \iff \sigma(S) \le \rho(S)$ for all $\emptyset \ne S \subset \Phi$.
      [Figure: the rate region in the $(R_1, R_2)$ plane, bounded by $a_1 \le R_1 \le b_1$, $a_2 \le R_2 \le b_2$, $a_{12} \le R_1 + R_2 \le b_{12}$]
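The nonemptiness of this rate region can also be checked numerically as a linear-programming feasibility problem with a zero objective. A sketch, assuming scipy is available and reusing the all_subsets helper from the previous sketch (the function name is hypothetical):

```python
import numpy as np
from scipy.optimize import linprog

def sw_region_nonempty(sigma, rho, sources):
    """True iff {(R_s) : sigma(S) <= sum_{i in S} R_i <= rho(S) for all nonempty S} is nonempty."""
    sources = list(sources)
    idx = {s: k for k, s in enumerate(sources)}
    A_ub, b_ub = [], []
    for S in all_subsets(sources):
        if not S:
            continue
        row = np.zeros(len(sources))
        for s in S:
            row[idx[s]] = 1.0
        A_ub.append(row);  b_ub.append(rho(S))      # sum_{i in S} R_i <= rho(S)
        A_ub.append(-row); b_ub.append(-sigma(S))   # sum_{i in S} R_i >= sigma(S)
    res = linprog(np.zeros(len(sources)), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] * len(sources), method="highs")
    return res.status == 0  # status 0 means a feasible point was found
```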
  12. Han 2011
      Theorem (general): $X_\Phi$ is transmissible over $N$ $\iff$ $H(X_S \mid X_{\bar{S}}) \le \rho_N(S)$ for each $\emptyset \ne S \subset \Phi$.
      The proof is much more difficult: when $|\Psi| \ne 1$, $\rho_N$ is not necessarily a polymatroid, so Slepian-Wolf coding cannot be assumed for the proof of the direct part; the region
      $\{(R_s)_{s \in \Phi} \mid H(X_S \mid X_{\bar{S}}) \le \sum_{i \in S} R_i \le \rho_N(S),\ \emptyset \ne S \subset \Phi\}$
      may be empty.
  13. Example 1 for uniform and independent $X_1, X_2 \in \{0, 1\}$
      $\Phi = \{s_1, s_2\}$, $\Psi = \{t_1, t_2\}$, $c_{ij} = 1$ for $(i,j) \in E$
      [Figure: the network of Example 1 with a network code; the edges from $s_1$ carry $X_1$, the edges from $s_2$ carry $X_2$, the bottleneck carries $X_1 \oplus X_2$, and both sinks output $(X_1, X_2)$]
      $\rho_N(\{s_1\}) = \min(\rho_{t_1}(\{s_1\}), \rho_{t_2}(\{s_1\})) = 1$
      $\rho_N(\{s_2\}) = \min(\rho_{t_1}(\{s_2\}), \rho_{t_2}(\{s_2\})) = 1$
      $\rho_N(\{s_1, s_2\}) = \min(\rho_{t_1}(\{s_1, s_2\}), \rho_{t_2}(\{s_1, s_2\})) = 2$
      $H(X_1 \mid X_2) = H(X_1) = 1$, $\quad H(X_2 \mid X_1) = H(X_2) = 1$, $\quad H(X_1 X_2) = H(X_1) + H(X_2) = 2$
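The coding step on this slide is per-symbol XOR at the relay. A toy check (this only illustrates the XOR recovery at the sinks, not the block-coding argument):

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.integers(0, 2, size=16)   # uniform, independent bits from s1
x2 = rng.integers(0, 2, size=16)   # uniform, independent bits from s2
relay = x1 ^ x2                    # the bottleneck edge carries X1 ⊕ X2

assert np.array_equal(relay ^ x1, x2)  # sink t1: recovers X2 from X1 and X1 ⊕ X2
assert np.array_equal(relay ^ x2, x1)  # sink t2: recovers X1 from X2 and X1 ⊕ X2
```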
  14. Example 2 for a binary symmetric channel with crossover probability $p$
      [Figure: the network of Example 2 with a network code; the direct edges carry $X_1$ and $X_2$, the coded edges carry $AX_1$, $AX_2$, and $A(X_1 \oplus X_2)$, and both sinks output $(X_1, X_2)$]
      $\rho_N(\{s_1\}) = \min(\rho_{t_1}(\{s_1\}), \rho_{t_2}(\{s_1\})) = h(p)$
      $\rho_N(\{s_2\}) = \min(\rho_{t_1}(\{s_2\}), \rho_{t_2}(\{s_2\})) = h(p)$
      $\rho_N(\{s_1, s_2\}) = \min(\rho_{t_1}(\{s_1, s_2\}), \rho_{t_2}(\{s_1, s_2\})) = \min\{1 + 2h(p), 2\}$
      $H(X_1 \mid X_2) = h(p)$, $\quad H(X_2 \mid X_1) = h(p)$, $\quad H(X_1 X_2) = 1 + h(p)$
      $A$: an $m \times n$ binary matrix with $m = n h(p)$ (Körner-Marton, 1979)
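The Körner-Marton construction relies on linearity over GF(2): $A(X_1 \oplus X_2)$ can be formed from the two syndromes $AX_1$ and $AX_2$ without knowing $X_1$ or $X_2$ themselves. A minimal numerical check (toy sizes chosen arbitrarily; the syndrome-decoding step that recovers $X_1 \oplus X_2$ at rate $h(p)$ is not shown):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 5                              # toy sizes; Körner-Marton takes m ≈ n·h(p) with a good code
A = rng.integers(0, 2, size=(m, n))      # binary matrix standing in for the syndrome map A
x1 = rng.integers(0, 2, size=n)
x2 = rng.integers(0, 2, size=n)

lhs = A @ (x1 ^ x2) % 2                  # A(X1 ⊕ X2), what should reach the sinks
rhs = (A @ x1 + A @ x2) % 2              # AX1 ⊕ AX2, computable from the two syndromes
assert np.array_equal(lhs, rhs)          # linearity over GF(2)
```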
  15. Ahlswede et al. 2000 ($|\Phi| = 1$)
      Proposes a coding scheme ($\alpha$-, $\beta$-, $\gamma$-codes), with $\Phi = \{s\}$ and $R = (R_{i,j})_{(i,j) \in E}$, to show:
      Theorem ($|\Phi| = 1$): $R$ is achievable for $X_s$ and $G$ $\iff$ the capacity of $R$ is no less than $H(X_s)$.
      The $\alpha$-, $\beta$-, $\gamma$-codes also handle non-DAG cases (with loops). For DAGs, Ahlswede et al. 2000 is subsumed by Han 2011, but it additionally covers the non-DAG case.
  16. Conclusion
      Contribution: a short survey of the three papers.
      Future work: extension of Han 2011 to the non-DAG case (with loops).
