# Set Models: The Extended Boolean Model

Information Retrieval Systems lecture notes, Prof. Kang Seung-Shik (강승식)


1. 2.6 Alternative Set Theoretic Models
   - Fuzzy Set Model
   - Extended Boolean Model
2. 2.6.1 Fuzzy Set Model
   - Fuzzy set theory
     - Deals with the representation of classes whose boundaries are not well defined
     - Membership in a fuzzy set is intrinsically gradual rather than abrupt (as in conventional Boolean logic)
   - [Figure: fuzzy membership functions for "tall" and "very tall" over height, contrasted with a conventional (crisp) membership function]
3. Fuzzy Set Model (Cont.)
   - Definition: a fuzzy subset A of a universe U is characterized by a membership function μ_A : U → [0, 1] that assigns to each element u of U a degree of membership μ_A(u)
   - Definition: the standard fuzzy set operations are defined pointwise
     - complement: μ_¬A(u) = 1 − μ_A(u)
     - union: μ_{A∪B}(u) = max(μ_A(u), μ_B(u))
     - intersection: μ_{A∩B}(u) = min(μ_A(u), μ_B(u))
4. Fuzzy Set Model (Cont.)
   - Fuzzy information retrieval
     - Representing documents and queries through sets of keywords yields descriptions that are only partially related to the real semantic contents of the respective documents and queries
     - Each query term defines a fuzzy set
     - Each document has a degree of membership in this set
   - This membership degree is used to rank the documents relative to the user query
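One common way to realize this fuzzy-IR ranking is through a keyword correlation matrix, with a document's membership in a term's fuzzy set computed as an algebraic sum over the correlations of the terms it contains. The sketch below follows that standard construction; the toy documents are hypothetical.

```python
# A minimal fuzzy-IR sketch. c[i][l] = n_il / (n_i + n_l - n_il) is the
# keyword correlation (n_i = docs containing term i, n_il = docs
# containing both), and mu[i][j] = 1 - prod(1 - c[i][l]) over terms l
# in document j is its membership in term i's fuzzy set.
docs = [
    {"gold", "silver"},
    {"gold", "truck"},
    {"silver", "truck", "gold"},
]
terms = sorted({t for d in docs for t in d})

def n(*ts):
    """Number of documents containing all the given terms."""
    return sum(all(t in d for t in ts) for d in docs)

# Term-term correlation factors in [0, 1]
c = {i: {l: n(i, l) / (n(i) + n(l) - n(i, l)) for l in terms} for i in terms}

def membership(term, doc):
    """Degree to which `doc` belongs to the fuzzy set of `term`."""
    prod = 1.0
    for l in doc:
        prod *= 1.0 - c[term][l]
    return 1.0 - prod

# Rank documents for the single-term query "silver"; documents that do
# not contain "silver" still get a nonzero, graded score
ranking = sorted(range(len(docs)), key=lambda j: -membership("silver", docs[j]))
```

Note how document 1, which lacks "silver" entirely, still receives a graded membership through its correlated terms, which is exactly what distinguishes this model from strict Boolean matching.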
5. 2.6.2 Extended Boolean Model
   - Motivation
     - Boolean model
       - Simple and elegant
       - No provision for term weighting
       - No ranking of the answer set
       - Output might be too large or too small
     - Vector space model
       - Simple, fast, better retrieval performance
     - Extended Boolean model
       - Combines Boolean query formulations with characteristics of the vector model
6. Extended Boolean Model (Cont.)
   - The model is based on a critique of a basic assumption of Boolean logic
     - Conjunctive Boolean query (k_x AND k_y):
       - A document which contains only one of the terms k_x and k_y is as irrelevant as a document which contains neither of them
     - Disjunctive Boolean query (k_x OR k_y):
       - A document which contains either k_x or k_y is as relevant as a document which contains both of them
7. Extended Boolean Model (Cont.)
   - When only two terms are considered, queries and documents are plotted on a two-dimensional map
   - [Figure: documents d_j and d_j+1 plotted in the unit square with corners (0,0), (1,0), (0,1), (1,1); one panel for k_x AND k_y, one for k_x OR k_y]
8. Extended Boolean Model (Cont.)
   - Disjunctive query (k_x OR k_y):
     - The point (0,0) is the spot to be avoided
     - Measure of similarity: distance from the point (0,0)
   - Conjunctive query (k_x AND k_y):
     - The point (1,1) is the most desirable spot
     - Measure of similarity: complement of the distance from the point (1,1)
9. Extended Boolean Model (Cont.)
   - P-norm model
     - Generalizes the notion of distance to include not only Euclidean distance but also p-distances, with 1 ≤ p ≤ ∞
     - The p value is specified at query time
     - Generalized disjunctive query: q_or = k_1 ∨^p k_2 ∨^p … ∨^p k_m
     - Generalized conjunctive query: q_and = k_1 ∧^p k_2 ∧^p … ∧^p k_m
10. Extended Boolean Model (Cont.)
    - P-norm model query-document similarity (x_i is the weight of term k_i in document d_j):
      - sim(q_or, d_j) = ((x_1^p + x_2^p + … + x_m^p) / m)^(1/p)
      - sim(q_and, d_j) = 1 − (((1 − x_1)^p + … + (1 − x_m)^p) / m)^(1/p)
    - Example: for p = 2 and two terms these reduce to the normalized distances from (0,0) and (1,1) of the previous slide
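The standard p-norm similarities (due to Salton, Fox and Wu) can be sketched directly; p = 1 reduces to a vector-model-style average, while large p approaches strict Boolean behavior.

```python
# P-norm similarity. x is the vector of term weights of one document
# over the m query terms; p >= 1 is chosen at query time.
def sim_or(x, p):
    """Generalized disjunctive similarity: normalized p-distance from (0,...,0)."""
    m = len(x)
    return (sum(xi ** p for xi in x) / m) ** (1 / p)

def sim_and(x, p):
    """Generalized conjunctive similarity: complement of the p-distance from (1,...,1)."""
    m = len(x)
    return 1 - (sum((1 - xi) ** p for xi in x) / m) ** (1 / p)

x = [0.5, 0.5]
print(sim_or(x, 2))   # = 0.5, the normalized Euclidean distance from (0,0)
print(sim_and(x, 2))  # = 0.5, the complement of the distance from (1,1)
```

Raising p sharpens the Boolean character: for a document matching only one of two OR-ed terms, sim_or grows toward 1 as p grows, mimicking the strict Boolean "any match is a full match".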
11. 2.7 Alternative Algebraic Models
    - Generalized Vector Space Model
    - Latent Semantic Indexing Model
    - Neural Network Model
12. 2.7.1 Generalized Vector Space Model
    - The three classic models assume independence of index terms
    - Generalized vector space model
      - Index term vectors are assumed linearly independent but not pairwise orthogonal
      - Co-occurrence of index terms inside documents of the collection induces dependencies among these index terms
      - Document ranking combines the standard term-document weights with term-term correlation factors
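The effect of folding co-occurrence-based term correlations into the ranking can be illustrated with a simplified sketch (not the full minterm construction of the generalized VSM); the tiny weight matrix is hypothetical.

```python
# Hedged GVSM-style sketch: derive term-term correlations from
# co-occurrence and combine them with the term-document weights.
import numpy as np

W = np.array([          # term-by-document weight matrix (rows: terms)
    [1.0, 0.0, 1.0],    # term 0
    [1.0, 1.0, 0.0],    # term 1
    [0.0, 1.0, 1.0],    # term 2
])

C = W @ W.T             # term-term correlation via co-occurrence
q = np.array([1.0, 0.0, 0.0])   # query mentions term 0 only

classic = q @ W         # classic VSM score per document
general = q @ C @ W     # score with term correlations folded in
```

Document 1 contains no query term, so its classic score is zero; with the correlation factors it still receives a positive score, because term 0 co-occurs with terms 1 and 2 elsewhere in the collection.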
13. 2.7.2 Latent Semantic Indexing Model
    - Motivation: problems of lexical matching
      - There are many ways to express a given concept (synonymy), so relevant documents that are not indexed by any of the query keywords are not retrieved
      - Most words have multiple meanings (polysemy), so many unrelated documents might be included in the answer set
    - Idea
      - Map each document and query vector into a lower-dimensional space associated with concepts
      - This can be done by singular value decomposition (SVD)
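A minimal LSI sketch, assuming a hypothetical term-document matrix, the usual rank-k truncation, and the standard query fold-in:

```python
# LSI: SVD of the term-document matrix, keep k singular values, and
# compare documents and queries in the reduced concept space.
import numpy as np

A = np.array([      # terms x documents (hypothetical counts)
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 1],
], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                                   # number of latent concepts
Uk, sk, Vk = U[:, :k], s[:k], Vt[:k, :]

docs_k = np.diag(sk) @ Vk               # documents in concept space

def fold_in(q):
    """Map a query term-vector into the k-dimensional concept space."""
    return np.diag(1 / sk) @ Uk.T @ q   # standard fold-in: S^-1 U^T q

q = np.array([1.0, 1.0, 0.0, 0.0])
qk = fold_in(q)

# Cosine similarity of the query against each document column
sims = []
for j in range(docs_k.shape[1]):
    dj = docs_k[:, j]
    sims.append(float(qk @ dj / (np.linalg.norm(qk) * np.linalg.norm(dj))))
```

Because the comparison happens in concept space, a document can score well even without sharing a literal keyword with the query, which is how LSI addresses synonymy.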
14. 2.7.3 Neural Network Model
    - Motivation
      - In a conventional IR system, document vectors are compared with query vectors to compute a ranking; index terms in documents and queries have to be matched and weighted for this computation
      - Neural networks are known to be good pattern matchers and can serve as an alternative IR model
    - A neural network is a simplified graph representation of the mesh of interconnected neurons in the human brain
      - Nodes: processing units; edges: synaptic connections
      - Weights: strengths of the connections
      - Computation proceeds by spreading activation
15. Neural Network Model (Cont.)
    - Three layers: query terms, document terms, documents
    - Spreading activation process
      - In the first phase, the query term nodes initiate the process by sending signals to the document term nodes, which in turn send signals to the document nodes
      - The document nodes send new signals back to the document term nodes, which again fire signals to the document nodes, and this process repeats
      - The signals become weaker at each iteration, so the process eventually halts
16. Neural Network Model (Cont.)
    - Example
      - D1: Cats and dogs eat.
      - D2: The dog has a mouse.
      - D3: Mice eat anything.
      - D4: Cats play with mice and rats.
      - D5: Cats play with rats.
      - Query: Do cats play with mice?
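The three-layer spreading-activation process can be sketched on this example collection; the term-to-document weights (length-normalized occurrence) and the decay factor are illustrative assumptions, not the lecture's exact values.

```python
# Spreading activation over query terms -> document terms -> documents,
# with decayed feedback rounds, on the slide's example collection.
docs = {
    "D1": ["cats", "dogs", "eat"],
    "D2": ["dog", "mouse"],
    "D3": ["mice", "eat", "anything"],
    "D4": ["cats", "play", "mice", "rats"],
    "D5": ["cats", "play", "rats"],
}
query = ["cats", "play", "mice"]   # "Do cats play with mice?"

terms = sorted({t for ws in docs.values() for t in ws})
# Illustrative weight: 1/len(doc) when the term occurs, else 0
w = {t: {d: (1 / len(ws) if t in ws else 0.0) for d, ws in docs.items()}
     for t in terms}

# Phase 1: query term nodes fire into document term nodes, then into documents
term_act = {t: (1.0 if t in query else 0.0) for t in terms}
doc_act = {d: sum(term_act[t] * w[t][d] for t in terms) for d in docs}

# Feedback rounds: documents fire back to terms, terms fire to documents
# again; each round is attenuated so the signals die out
decay = 0.1
for _ in range(5):
    term_act = {t: decay * sum(doc_act[d] * w[t][d] for d in docs) for t in terms}
    doc_act = {d: doc_act[d] + sum(term_act[t] * w[t][d] for t in terms)
               for d in docs}

ranking = sorted(docs, key=lambda d: -doc_act[d])
```

D4, which covers all three query terms, ends up ranked first, while D2, which shares no term with the query (and whose terms "dog" and "mouse" never receive activation), stays at zero.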
17. 2.8 Alternative Probabilistic Models
    - Bayesian Networks
    - Inference Network Model
    - Belief Network Model
18. 2.8.1 Bayesian Networks
    - Bayesian networks are directed acyclic graphs (DAGs)
      - Nodes: random variables
        - The parents of a node are those judged to be direct causes of it
      - Arcs: causal relationships between variables
        - The strengths of causal influences are expressed by conditional probabilities
    - [Figure: example DAG over random variables x1 … x5]
19. 2.8.2 Inference Network Model
    - Uses evidential reasoning to estimate the probability that a document will be relevant to a query
    - The ranking of a document d_j with respect to a query q is a measure of how much evidential support the observation of d_j provides to the query q
20. Inference Network Model (Cont.)
    - Simple inference networks
    - [Figure: example inference network over nodes A, B, C, D, E, X, Y, F]
21. Inference Network Model (Cont.)
    - Link matrices
      - Indicate the strength by which parents (by themselves or in conjunction with other parents) affect their children in the inference network
    - Example link matrix for a node Y with parents D and E, where P(D) = 0.8 and P(E) = 0.4:

      |         | ¬D, ¬E | ¬D, E | D, ¬E | D, E |
      |---------|--------|-------|-------|------|
      | Y false | 0.95   | 0.8   | 0.2   | 0.1  |
      | Y true  | 0.05   | 0.2   | 0.8   | 0.9  |
22. Inference Network Model (Cont.)
    - Inference network example
      - Three layers: document layer, term (concept) layer, and query layer
      - Documents are represented as nodes, and a link runs from a document to each term it contains
    - [Figure: inference network with document nodes d1, d2, d3, term nodes t1 … t4, and query node Q]
23. Inference Network Model (Cont.)
    - Relevance ranking with an inference network
      - Processing begins when a document, say D1, is instantiated (we assume D1 has been observed)
      - This instantiates all term nodes of D1
      - All links emanating from the term nodes just activated are instantiated, and the query node is activated
      - The query node then computes the belief in the query given D1; this value is used as the similarity coefficient for D1
      - The process continues until all documents have been instantiated
24. Inference Network Model (Cont.)
    - Example of computing the similarity coefficient
      - Q: "gold silver truck"
      - D1: "Shipment of gold damaged in a fire."
      - D2: "Delivery of silver arrived in a silver truck."
      - D3: "Shipment of gold arrived in a truck."
    - Normalized term frequencies (rows D1–D3) with idf and normalized idf per term:

      |      | t1 | t2   | t3   | t4   | t5   | t6   | t7  | t8  | t9   | t10  | t11  |
      |------|----|------|------|------|------|------|-----|-----|------|------|------|
      | D1   | 1  | 0    | 1    | 0    | 1    | 1    | 1   | 1   | 0    | 1    | 0    |
      | D2   | 0.5| 0.5  | 0    | 0.5  | 0    | 0    | 0.5 | 0.5 | 1    | 0    | 0.5  |
      | D3   | 1  | 1    | 0    | 0    | 0    | 1    | 1   | 1   | 0    | 1    | 1    |
      | idf  | 0  | 0.41 | 1.10 | 1.10 | 1.10 | 0.41 | 0   | 0   | 0.41 | 0.41 | 0.41 |
      | nidf | 0  | 0.37 | 1    | 1    | 1    | 0.37 | 0   | 0   | 0.37 | 0.37 | 0.37 |
25. Inference Network Model (Cont.)
    - Constructing the link matrix for terms
      - Belief in a given term k_i given a document d_j:
        - P_ij = 0.5 + 0.5 (ntf_ij)(nidf_i)
        - e.g. P_gold,3 = 0.5 + 0.5 (0.37)(1) = 0.685
      - Link matrices (belief in each term given the single instantiated parent document):

        | gold  | D1    | D3    | otherwise |
        |-------|-------|-------|-----------|
        | True  | 0.685 | 0.685 | 0         |
        | False | 0.315 | 0.315 | 1         |

        | silver | D2    |
        |--------|-------|
        | True   | 0.685 |
        | False  | 0.315 |

        | truck | D2    | D3    | otherwise |
        |-------|-------|-------|-----------|
        | True  | 0.592 | 0.685 | 0         |
        | False | 0.408 | 0.315 | 1         |
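The idf values in the example are consistent with natural-log idf = ln(N/df) normalized by the maximum idf, and ntf is the term frequency normalized by the document's maximum term frequency. A quick check of P_gold,3 under those assumptions:

```python
# Reproducing the term-belief formula P_ij = 0.5 + 0.5 * ntf_ij * nidf_i
# for gold in D3 (tf = 1, max tf in D3 = 1; gold occurs in 2 of 3 docs).
import math

ntf = 1 / 1                            # normalized term frequency
nidf = math.log(3 / 2) / math.log(3)   # ln(3/2) / ln(3) ≈ 0.37
p_gold_d3 = 0.5 + 0.5 * ntf * nidf     # ≈ 0.685, as on the slide
```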
26. Inference Network Model (Cont.)
    - Computing the similarity coefficient
      - Link matrix for the query node Q, given which of gold (g), silver (s), truck (t) are instantiated:

        |         | none | g   | s   | t   | g,s | g,t | s,t | g,s,t |
        |---------|------|-----|-----|-----|-----|-----|-----|-------|
        | Q true  | 0.1  | 0.3 | 0.3 | 0.5 | 0.5 | 0.7 | 0.7 | 0.9   |
        | Q false | 0.9  | 0.7 | 0.7 | 0.5 | 0.5 | 0.3 | 0.3 | 0.1   |

      - bel(gold|D1) = 0.685, bel(silver|D1) = 0, bel(truck|D1) = 0
      - Bel(Q|D1) = 0.1(0.315)(1)(1) + 0.3(0.685)(1)(1) + 0.3(0.315)(0)(1) + 0.5(0.685)(0)(1) + 0.5(0.315)(1)(0) + 0.7(0.685)(1)(0) + 0.7(0.315)(0)(0) + 0.9(0.685)(0)(0) = 0.237
      - bel(gold|D2) = 0, bel(silver|D2) = 0.685, bel(truck|D2) = 0.592, Bel(Q|D2) = 0.589
      - bel(gold|D3) = 0.685, bel(silver|D3) = 0, bel(truck|D3) = 0.685, Bel(Q|D3) = 0.511
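The expanded sum for Bel(Q|D1) can be reproduced programmatically: sum P(Q | combination) over the eight on/off combinations of the three query terms, weighting each by the product of bel or (1 − bel) per term. The P(Q|·) entries below are those implied by the coefficients of the expanded Bel(Q|D1) sum above.

```python
# Belief in the query given an instantiated document.
from itertools import product

p_q = {  # P(Q = true | which of gold, silver, truck are on)
    (0, 0, 0): 0.1, (1, 0, 0): 0.3, (0, 1, 0): 0.3, (0, 0, 1): 0.5,
    (1, 1, 0): 0.5, (1, 0, 1): 0.7, (0, 1, 1): 0.7, (1, 1, 1): 0.9,
}

def bel_q(bels):
    """bels = (bel(gold|D), bel(silver|D), bel(truck|D))."""
    total = 0.0
    for combo in product((0, 1), repeat=3):
        weight = 1.0
        for on, b in zip(combo, bels):
            weight *= b if on else 1 - b
        total += p_q[combo] * weight
    return total

# bel(gold|D1) = 0.685, bel(silver|D1) = 0, bel(truck|D1) = 0
print(bel_q((0.685, 0.0, 0.0)))   # ≈ 0.237, matching the slide
```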