
uPick


This presentation describes a proposed approach for filtering, with the help of humans, the named-entity relations extracted by automated systems.


  1. Agenda: introduction to named entity (NE) concepts; extracting relationships among NEs and the problems associated with doing so; our approach: human in the loop; proposed design and results.
  2. Terminology: named entities, relations, co-references.
  3. Named Entity: Definition. A named entity is an atomic element in a body of text. Types include person, organization, location, etc. Different named entities, when linked together, form a relation.
  4. Named Entity: An example. In "Sachin Tendulkar was born in Bombay.", 'Sachin Tendulkar' is an NE of type 'Person' and 'Bombay' is an NE of type 'Location'.
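
     This is the kind of labeling an off-the-shelf NER tool produces. A minimal
     sketch, assuming spaCy and its small English model are installed (an
     illustration; the deck does not name a specific tool here):

        # Label the slide's example sentence with spaCy's NER.
        # Setup assumed: pip install spacy && python -m spacy download en_core_web_sm
        import spacy

        nlp = spacy.load("en_core_web_sm")
        doc = nlp("Sachin Tendulkar was born in Bombay.")
        for ent in doc.ents:
            print(ent.text, ent.label_)
        # Typical (model-dependent) output:
        #   Sachin Tendulkar PERSON
        #   Bombay GPE   (spaCy's label for locations / geo-political entities)
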
  5. Relationship: Structure. Subject - Relation - Object, where the subject and object are NEs of any type and the relation is a verb, adjective, or adverb.
  6. Relationship: An example. In "Sachin Tendulkar was born in Bombay", the subject is 'Sachin Tendulkar', the relation is 'was born in', and the object is 'Bombay'.
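
     In code, such a triple is naturally a small record type; a minimal sketch
     (illustrative only, not the deck's implementation):

        # A (subject, relation, object) triple as a plain data structure.
        from dataclasses import dataclass

        @dataclass
        class Relation:
            subject: str    # NE of any type
            relation: str   # verb, adjective, or adverb phrase
            obj: str        # NE of any type

        r = Relation("Sachin Tendulkar", "was born in", "Bombay")
        print(r)  # Relation(subject='Sachin Tendulkar', relation='was born in', obj='Bombay')
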
  7. Co-references: An example. "Sachin was born in Bombay. He is a ..." Here, 'He' refers back to 'Sachin'.
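
     A deliberately naive resolution heuristic, shown only to make the idea
     concrete (my sketch, not the deck's method): resolve a pronoun to the
     nearest preceding person mention.

        # Naive co-reference heuristic: a pronoun refers to the closest
        # person mention before it. Real resolvers are far more sophisticated.
        def resolve_pronoun(tokens, pronoun_idx, person_indices):
            candidates = [i for i in person_indices if i < pronoun_idx]
            return tokens[candidates[-1]] if candidates else None

        tokens = "Sachin was born in Bombay . He is a ...".split()
        print(resolve_pronoun(tokens, pronoun_idx=6, person_indices=[0]))  # -> Sachin

     Note that exactly this kind of heuristic breaks on the ambiguous example
     on slide 11, which is part of the motivation for a human in the loop.
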
  8. Extracting relationships among NEs: Importance. Relations signify facts about a named entity. They are useful in question answering systems and in improving the accuracy of search results.
  9. Extracting relationships among NEs: Standard process. (1) Identify named entities within a sentence. (2) Find the verb or adjective that connects the identified named entities. (3) Connect them together to form a relation. (A sketch of this pipeline follows below.)
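
     A sketch of these three steps, assuming spaCy as the underlying tagger;
     the "keep the verbs/adjectives between two entities" rule is my
     simplification of step 2:

        # Standard process: (1) find NEs, (2) find the connecting verb or
        # adjective between them, (3) join into (subject, relation, object).
        import spacy

        nlp = spacy.load("en_core_web_sm")

        def extract_relations(sentence):
            doc = nlp(sentence)
            ents = list(doc.ents)                                # step 1
            triples = []
            for subj, obj in zip(ents, ents[1:]):
                link = [t.text for t in doc[subj.end:obj.start]
                        if t.pos_ in ("VERB", "AUX", "ADJ", "ADP")]   # step 2
                if link:
                    triples.append((subj.text, " ".join(link), obj.text))  # step 3
            return triples

        print(extract_relations("Sachin Tendulkar was born in Bombay."))
        # e.g. [('Sachin Tendulkar', 'was born in', 'Bombay')]
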
  10. Extracting relationships among NEs: Difficulty. Co-references; abbreviations, acronyms, and ambiguous words in several places; complex sentence structure.
  11. Extracting relationships among NEs: Difficulty, an example. "Tom called his father last night. They talked for an hour. He said he would be home the next day." What is 'He' referring to: Tom or his father?
  12. Extracting relationships among NEs: Required process. (1) Identify part-of-speech constructs: noun, verb, adjective, etc. (2) Resolve co-references, acronyms, and abbreviations. (3) Connect them together to form a relationship.
  13. Extracting relationships among NEs: Automated approaches. Natural language processing: part-of-speech taggers, conditional random fields (CRF). Machine learning: hidden Markov models, support vector machines. Statistical methods: maximum entropy. Other methods: vocabulary-based systems, context-based clustering.
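
     For instance, a part-of-speech tagger (the same automated technique uPick
     builds on later) can be run in a few lines with NLTK; a sketch, assuming
     the standard NLTK data packages are downloaded:

        # Part-of-speech tagging with NLTK's averaged perceptron tagger.
        # Setup assumed: nltk.download('punkt'); nltk.download('averaged_perceptron_tagger')
        import nltk

        tokens = nltk.word_tokenize("Sachin Tendulkar was born in Bombay.")
        print(nltk.pos_tag(tokens))
        # e.g. [('Sachin', 'NNP'), ('Tendulkar', 'NNP'), ('was', 'VBD'),
        #       ('born', 'VBN'), ('in', 'IN'), ('Bombay', 'NNP'), ('.', '.')]
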
  14. Issues with the automated extraction techniques. Dependency: they rely on external vocabulary sources such as Wikipedia, WordNet, MindNet, etc.; maintaining and updating these sources is manual, costly, and requires expertise; and their limited size produces context-based noise. Scalability: the techniques are domain-dependent, corpus-dependent, and relation-specific.
  15. Crowdsourcing: harness the wisdom of the crowd. From traditional to human computation.
  16. Crowdsourcing in terms of NE relationship extraction. Advantages: humans can easily extract and find different facts in a text body, and they can verify the accuracy of the relationships obtained from automated techniques. Disadvantages: humans are not like computers; they find the task boring and cumbersome; and they need incentives to participate.
  17. Incentive mechanisms: money, fun, social.
  18. 1. Monetary incentives. Features: can scale massively; harnesses the majority vote; example: Amazon Mechanical Turk. Disadvantages: money is not the only source of motivation; work is not credited, so it may encourage cheating; requires filtering and monitoring; and it raises labor-law concerns.
  19. 2. Social incentives. Features: the task is distributed among social peers (the crowd); harnesses trust; examples: Flickr, Facebook, Quora. Disadvantages: does not scale beyond the closed network; is specific to the participating crowd; requires filtering and monitoring.
  20. 3. Fun incentives. Features: games are seductive; they bring collaboration, curiosity, challenge, competition, and fun; the task is generally hidden; examples: ESP, GWAP. Disadvantages: needs someone to play with; improper game play may encourage cheating; more cognitive work leads to less fun; requires filtering and monitoring.
  21. Existing crowdsourcing ideas. Monetary incentive: Amazon Mechanical Turk, used for collecting named entities, finding relational hierarchies, phrase detection, etc.; still no solution for verification! Fun incentive: games with a purpose (Verbosity, Categorilla, Phrase Detectives), used for collecting common-sense facts and producing entities for templates; still a higher cognitive task!
  22. uPick: automated techniques + human intelligence.
  23. uPick architecture.
  24. How uPick works. Step 1: extract NEs and relations using a POS tagger (automated technique). Step 2: present the extracted relations to a crowd in the form of a game (challenge). Step 3: filter the relations by collecting majority votes (sketched below).
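
     A sketch of the step 3 filter; the vote bookkeeping is hypothetical, since
     the deck does not specify its data format:

        # Keep a relation only if a strict majority of players voted it valid.
        def filter_relations(votes):
            """votes maps each relation triple to a list of True/False votes."""
            return [rel for rel, vs in votes.items() if sum(vs) > len(vs) / 2]

        votes = {
            ("Sachin Tendulkar", "was born in", "Bombay"): [True, True, False],
            ("Sachin Tendulkar", "played for", "Bombay"):  [False, True, False],
        }
        print(filter_relations(votes))  # keeps only the first relation
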
  25. uPick: Game in action.
  26. uPick scoring. For the first player, compare the output with expert judgments. For subsequent players, check against the majority vote (> 50%).
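
     A sketch of this scoring rule (the function name and vote bookkeeping are
     my assumptions):

        # First player is scored against expert judgments; later players
        # against the running majority (> 50%) of earlier answers.
        def is_correct(answer, expert_answer, prior_answers):
            if not prior_answers:                   # first player
                return answer == expert_answer
            agree = sum(1 for a in prior_answers if a == answer)
            return agree > len(prior_answers) / 2   # strict majority

        print(is_correct(True, True, []))                    # True: matches expert
        print(is_correct(True, None, [True, True, False]))   # True: matches majority
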
  27. uPick benefits. Effectiveness: minimal cognitive effort thanks to the click-based interaction; no dependency on external resources, and therefore scalable. Generalization: language- and corpus-independent; can be extended to solve other similar NLP problems.
  28. User study: a supervised laboratory study. Participants: 12 (4 male and 8 female). Two one-hour sessions: training and game play. uPick document set: four documents, on Ashok Maurya, Sachin Tendulkar, Shahrukh Khan, and Sonia Gandhi.
  29. Results: accuracy of the uPick scheme after considering the majority votes of the participants.

                                                      D1    D2    D3    D4
      Total number of presented relations             37    39    40    33
      Correctly identified valid relations            19    18    19    15
      Valid relations incorrectly marked invalid       5     6     4     1
      Correctly identified invalid relations          12    12    16    15
      Invalid relations incorrectly marked valid       1     3     1     2
      Accuracy (correctly identified relations
        / total relations)                            84%   77%   87%   91%
      Accuracy using automated techniques only
        (valid relations / total relations)           65%   61%   57%   49%
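
     A quick arithmetic check of the accuracy row (my recomputation from the
     table's counts):

        # accuracy = (correct valid + correct invalid) / total presented
        correct_valid   = [19, 18, 19, 15]
        correct_invalid = [12, 12, 16, 15]
        total           = [37, 39, 40, 33]
        for cv, ci, t in zip(correct_valid, correct_invalid, total):
            print(f"{100 * (cv + ci) / t:.1f}%")
        # 83.8%, 76.9%, 87.5%, 90.9% -- rounded on the slide to 84%, 77%, 87%, 91%
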
  30. Conclusion. Participants did not find the game design engaging. uPick proved helpful in remembering various facts related to a text body.
  31. Future Work. A leaderboard. A more engaging game-play design, for example physics-based puzzles and object-finding games. Extension to a question answering system based on an individual document.
