Towards a theory of semantic communication
  • For another example of an information source in a broader sense, the information carried by DNA is encoded using a four-letter alphabet (bases A, G, C, T). DNA's syntactical entropy can be obtained from statistical studies of bases or sequences of bases, with estimates ranging from 1.6 to 1.9 bits per base~\\cite{LY96,citeulike:769969,schmidt1997}. However, the ``semantics'' of DNA is only expressed after a complex process that produces functional gene products such as RNAs or proteins. The process is not yet fully understood, but it has been observed that variations in DNA do not necessarily result in different gene products [citation of degeneracy of the DNA code], nor will DNA be expressed in exactly the same way under different conditions [citation of genetic expression variation]. If we measure the amount of information carried by DNA based on its functional gene products, our conjecture is that it might differ from the DNA's syntactical entropy.

Presentation Transcript

  • Towards a Theory of Semantic Communication Jie Bao, RPI Joint work with Prithwish Basu, Mike Dean, Craig Partridge, Ananthram Swami, Will Leland and Jim Hendler
  • Outline
    • Background
    • A general semantic communication model
    • Measuring semantics
    • Semantic data compression (source coding)
    • Semantic reliable communication (channel coding)
    • Path ahead
  • Shannon, 1948
    • “The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; ... These semantic aspects of communication are irrelevant to the engineering problem.”
    Claude E. Shannon. A mathematical theory of communication. Bell System Technical Journal, 27:379-423, 623-656, 1948.
  • But, are these just sequences of bits?
    • Movie streams
    • Software codes
    • DNA sequences
    • Emails
    • Tweets
    • ……
    “The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; ...” “These semantic aspects of communication are irrelevant to the engineering problem”?
  • Between a Talent Manager & Me
    • “Are you open to discuss greener pastures?”
    “Thanks for contacting me. However, I'm not sure if my research is related to "greener pastures". I'm a computer scientist.”
  • Misunderstanding can be costly: Mars Climate Orbiter (1998-1999), $125 million. Expressed: pound-force (lbf); interpreted: newton (N). Image source: Wikipedia, http://en.wikipedia.org/wiki/Mars_Climate_Orbiter#Communications_loss
  • Misunderstanding can be deadly
    • Afghan National Army (ANA) to ISAF
    • “Launch flares over the left side of the village”
    • Received and understood as
    • “Fire on the left side of the village”
    • Alternative semantic coding (e.g., “illuminating shell”) may save lives!
    Scenario of (noisy) battlefield communication, based on a report from http://www.closeprotectionworld.co.uk/security-news-asia/37466-afghanistan-war-what-happens-when-war-interpreter-doesnt-know-language.html
  • Our Contributions
    • We develop a generic model of semantic communication, extending the classic model-theoretical work of (Carnap and Bar-Hillel 1952) ;
    • We discuss the role of semantics in reducing source redundancy , and potential approaches for lossless and lossy semantic data compression;
    • We define the notions of semantic noise, semantic channel , and obtain the semantic capacity of a channel.
  • Outline
    • Background
    • A general semantic communication model
    • Measuring Semantics
    • Semantic data compression (source coding)
    • Semantic reliable communication (channel coding)
    • Path ahead
  • Shannon, 1948 [diagram]: the Shannon model (message -> signal -> signal -> message) extended with a semantic channel carrying expressed messages (e.g., commands and reports); from IT to SIT, i.e., from (classical) Information Theory to Semantic Information Theory.
  • A Three-level Model (Weaver) [diagram]
    • Level A (Technical): transmitter -> physical channel -> receiver, carrying the technical message, subject to technical noise
    • Level B (Semantic): semantic transmitter -> semantic receiver, turning the intended message into the expressed message, subject to semantic noise and relying on shared and local knowledge
    • Level C (Effectiveness): source -> destination (effectiveness factors)
  • A Semantic Communication Model [diagram]
    • Sender: world model W_s, background knowledge K_s, inference procedure I_s; a message generator turns observations of the world into messages M_s
    • Receiver: world model W_r, background knowledge K_r, inference procedure I_r; a message interpreter consumes messages M_r
    • The exchanged messages {m} follow a shared message syntax M; feedback (?) from receiver to sender
  • Semantic Sources
    • Key: a semantic source tells something that is “true”
      • Engineering bits are neither true nor false!
    • Goal: 1) more soundness (sent as “true” -> received as “true”); 2) less ambiguity
  • Outline
    • Background
    • A general semantic communication model
    • Measuring semantics
    • Semantic data compression (source coding)
    • Semantic reliable communication (channel coding)
    • Path ahead
  • Measuring Semantic Information
    • Basic Problem : What is the amount of “semantics” carried by a source and its messages?
  • Measuring Semantic Information
    • Statistical approach : Inference may change the distribution of symbols, hence the entropy of the source.
    • Model-theoretical approach : The less “likely” a message is to be true , the more information it contains.
    • Algorithmic approach : What’s the minimal program needed to describe messages and their deductions?
    • Situation-theoretical approach : measuring the divergence of messages to “truth”.
    Our approach: the model-theoretical one
  • Shannon: Information = “surpriseness”. H(tyrannosaurus) > H(dog). Captured from: http://www.wordcount.org/main.php
  • Which sentence is more “surprising”? “Rex is not a tyrannosaurus” or “Rex is not a dog”?
  • Model Semantics
    • tyrannosaurus
    • dog
  • “Semantics” of DNA [diagram]: “syntax” (the DNA sequence) maps via gene expression to its model (“semantics”, e.g., a protein). Image courtesy: http://www.yourdictionary.com/dna and http://www.pnl.gov/biology/images/protein_molecule.jpg
  • Stone-age Semantic Communication
    • Semantic communication predates symbolic communication
    Altamira Cave Painting http://mandyking.files.wordpress.com/2011/02/altamira-cave.jpg
  • Semantics of Messages
    • Messages are expressions , not just sequences of symbols
      • E.g., Saturday->Weekend, Sunny & Cold
    • If an expression is more commonly true, it contains less semantic information
      • inf(Sunny & Cold) > inf(Cold)
      • inf(Cold) > inf(Cold or Warm)
  • Semantics of Messages
    • Carnap & Bar-Hillel (1952) - “An outline of a theory of semantic information”
      • m(exp) = |mod(exp)| / |all models|
      • inf(exp) = - log m(exp)
    • Example
      • m(A v B) = 3/4, m(A ^ B) = 1/4
      • inf(A v B) = 0.415, inf(A ^ B) = 2
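The two formulas above can be checked mechanically by enumerating truth assignments. A minimal sketch (the function names and the lambda encoding of expressions are illustrative, not from the talk):

```python
from itertools import product
from math import log2

def logical_prob(expr, variables):
    """m(exp): the fraction of truth assignments (models) satisfying expr."""
    worlds = [dict(zip(variables, vals))
              for vals in product([False, True], repeat=len(variables))]
    return sum(1 for w in worlds if expr(w)) / len(worlds)

def inf(expr, variables):
    """inf(exp) = -log2 m(exp): semantic information content in bits."""
    return -log2(logical_prob(expr, variables))

# The slide's example over propositions A and B:
print(logical_prob(lambda w: w["A"] or w["B"], ["A", "B"]))   # 0.75
print(logical_prob(lambda w: w["A"] and w["B"], ["A", "B"]))  # 0.25
print(inf(lambda w: w["A"] or w["B"], ["A", "B"]))            # ~0.415
print(inf(lambda w: w["A"] and w["B"], ["A", "B"]))           # 2.0
```

Enumerating all 2^n assignments is exponential, which is fine for slide-sized examples; a real calculator would need model counting.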
  • Knowledge Entropy
    • Extending Carnap & Bar-Hillel (1952)
      • Models have a distribution
      • Background knowledge may be present
    Example: m(Weekend) = 2/7, m(Saturday) = 1/7
  • Knowledge Entropy
    • Logical prob. and knowledge entropy of Messages
    • Model entropy of an information source
    [equations: model distribution, logical probability]
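The Weekend = 2/7 and Saturday = 1/7 figures can be reproduced once models carry a probability distribution. A sketch under the assumption of seven equiprobable day-models (the variable names are mine):

```python
from math import log2

# Illustrative model set: the seven days of the week, assumed equiprobable.
days = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
p = {d: 1 / 7 for d in days}

def m(expr):
    """Logical probability under a model distribution: the total
    probability mass of the models in which expr holds."""
    return sum(p[d] for d in days if expr(d))

def inf(expr):
    """Knowledge entropy of a message: -log2 of its logical probability."""
    return -log2(m(expr))

m_weekend = m(lambda d: d in ("Sat", "Sun"))  # 2/7, as on the slide
m_saturday = m(lambda d: d == "Sat")          # 1/7

# Model entropy of the source: Shannon entropy of the model distribution.
H_model = -sum(q * log2(q) for q in p.values())  # log2(7) ~ 2.807 bits
```

With a non-uniform distribution over models, m(exp) is no longer a simple model count, which is exactly the extension over Carnap & Bar-Hillel the slide describes.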
  • Semantic Information Calculator (Demo)
    • http://www.cs.rpi.edu/~baojie/sit/index.php
  • Outline
    • Background
    • A general semantic communication model
    • Measuring Semantics
    • Semantic data compression (source coding)
    • Semantic reliable communication (channel coding)
    • Path ahead
  • Conditional Knowledge Entropy
    • When there is background knowledge, the set of possible worlds decreases.
  • Model Compression with Shared Knowledge
    • Background knowledge (A -> B), when shared, helps compress the source
      • Side information in the form of entailment
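A toy illustration of both slides (the equiprobable-worlds assumption and the propositional encoding are mine): sharing the entailment A -> B rules out the world where A holds and B does not, shrinking the possible worlds from four to three and the entropy from 2 to about 1.585 bits.

```python
from itertools import product
from math import log2

def worlds(variables, knowledge=lambda w: True):
    """Possible worlds over the variables that are consistent with
    the shared background knowledge."""
    return [dict(zip(variables, vals))
            for vals in product([False, True], repeat=len(variables))
            if knowledge(dict(zip(variables, vals)))]

all_worlds = worlds(["A", "B"])                                # 4 worlds
with_k = worlds(["A", "B"], lambda w: (not w["A"]) or w["B"])  # A -> B: 3 worlds

# Assuming equiprobable worlds, the (conditional) knowledge entropy drops:
H_plain = log2(len(all_worlds))  # 2 bits
H_cond = log2(len(with_k))       # log2(3) ~ 1.585 bits
```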
  • Lossless Message Compression
    • Theorem: there is a semantically lossless code for source X with message entropy H >= H(X_eq); no such code exists for H < H(X_eq)
      • X_eq is the set of equivalence classes of X
    • Example: no need to code both “pig” and “swine”; using one of them is sufficient.
    • Example 2: a -> (a ^ b) v (b ^ c) is equivalent to a -> b
    • Sometimes, the loss is intentional
      • Textual description of an image
      • Abstract of a paper
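The equivalence-class idea behind the theorem can be sketched by hashing each expression to its truth table, so that logically equivalent expressions, like the pair in Example 2, get a single codeword. The helper names are illustrative:

```python
from itertools import product

VARS = ["a", "b", "c"]

def truth_table(expr):
    """Canonical signature of an expression: its truth value in every
    world. Logically equivalent expressions share one signature."""
    return tuple(expr(dict(zip(VARS, vals)))
                 for vals in product([False, True], repeat=len(VARS)))

# Example 2 from the slide: a -> (a ^ b) v (b ^ c) vs. a -> b
e1 = lambda w: (not w["a"]) or ((w["a"] and w["b"]) or (w["b"] and w["c"]))
e2 = lambda w: (not w["a"]) or w["b"]

# A semantically lossless code assigns one codeword per equivalence class:
codebook = {}
def encode(expr):
    sig = truth_table(expr)
    return codebook.setdefault(sig, len(codebook))

print(truth_table(e1) == truth_table(e2))  # True: same equivalence class
print(encode(e1) == encode(e2))            # True: one codeword suffices
```

Coding over equivalence classes rather than raw expressions is what lets the message entropy reach H(X_eq) instead of H(X).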
  • Other Source Coding Strategies
    • Lossless model compression
      • E.g., using minimal models
    • Lossy message compression
      • E.g., compressing based on semantic similarity
    • Leave as future work
  • Outline
    • Background
    • A general semantic communication model
    • Measuring Semantics
    • Semantic data compression (source coding)
    • Semantic reliable communication (channel coding)
    • Path ahead
  • Semantic Noise
    • Examples
    • The meaning of a message is changed by technical noise, e.g., from “flare” to “fire”;
    • Semantic mismatch: the source/receiver use different background knowledge or inference (e.g., during the loss of the Mars Climate Orbiter);
    • Lost in translation: “uncle” in English has no exact correspondence in Chinese.
  • Semantic Noise and Channel Coding [diagram with words “coffee machine”, “copy machine”, “Xerox”; technical noise p -> ff; transition probabilities 0.9, 0.1, 1.0; stages W -> X -> Y -> W’]. Scenario developed based on reports in http://english.visitkorea.or.kr/enu/AK/AK_EN_1_6_8_5.jsp and http://blog.cleveland.com/metro/2011/03/identifying_photocopy_machine.html
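For a feel of the numbers: if the diagram's 0.9 / 0.1 transition labels are read as a binary symmetric channel at the technical level (a simplification of the scenario, not the talk's exact channel model), the engineering capacity sup I(X;Y), the baseline that the semantic capacity is compared against, is about 0.53 bits per use:

```python
from math import log2

def H2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# Hedged simplification: treat the 0.9 / 0.1 labels as a channel that
# flips the sent word (e.g., "copy" <-> "coffee") with probability 0.1.
flip = 0.1
engineering_capacity = 1 - H2(flip)  # sup I(X;Y) ~ 0.531 bits per use
print(engineering_capacity)
```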
  • Semantic Channel Coding Theorem
    • In the simplified model, assume no semantic mismatch (K_s = K_r, I_s = I_r)
    • Theorem 3: if the transmission rate is smaller than C_s (the semantic channel capacity), error-free coding exists
    • The semantic channel capacity may be higher or lower than the engineering channel capacity (sup I(X;Y))!
      • H(W|X) stands for the encoder's semantic ambiguity
      • avg(inf(Y)) is the receiver's “smartness”
  • Outline
    • Background
    • A general semantic communication model
    • Measuring Semantics
    • Semantic data compression (source coding)
    • Semantic reliable communication (channel coding)
    • Path ahead
  • Application in Coding & Validation
    • Hypothesis 1: using semantics we can achieve better data compression
    • Hypothesis 2: using semantics we can achieve more reliable communication
    • Validation by comparison with non-semantic algorithms
  • Extensions
    • Extensions & connections to other fields
      • First-order languages [probabilistic logics]
      • Inconsistent KBs (misinformation) [paraconsistent logics]
      • Lossy source coding [clustering and similarity measurement]
      • Semantic mismatches [extending Juba & Sudan 2011]
      • … …
  • Path ahead – Broad Impact
      • Communications (e.g., coding)
      • Linguistics (e.g., entropy of English)
      • Biology (e.g., semantics of genes)
      • Economics
      • … .
      • Areas wherever Shannon’s theory applies
      • And beyond (e.g., Semantic Web, ontology engineering)
  • Questions? Image courtesy: http://www.addletters.com/pictures/bart-simpson-generator/900788.htm