Search for Sentiment


Sentiment analysis meets search: presentation slides for Seth Grimes's talk Search for Sentiment at the Search Engine Meeting, April 27, 2010.

  1. Search for Sentiment
     Seth Grimes
     Alta Plana Corporation
     301-270-0795
     Search Engine Meeting
     April 27, 2010
  2. Seth Grimes
     Principal Consultant with Alta Plana Corporation.
     Contributing Editor, TechWeb.
     Channel Expert (text analytics).
     Founding Chair, Sentiment Analysis Symposium and Text Analytics Summit.
     Twitter: @SethGrimes, #SAS10
  3. Two assertions:
     Human communications are inherently subjective.
     Opinion often masquerades as Fact.
  4. Facts and Feelings
     The unemployment rate is 9.7%.
     Unemployment is WAY TOO HIGH!!
     The unemployment rate is higher than it was two years ago (5.1%).
     Former U.S. Federal Reserve Chairman Alan Greenspan said on Tuesday that the global recession will "surely be the longest and deepest" since the 1930s, adding that the Obama administration's Troubled Asset Relief Program will be insufficient to plug the yawning financial gap. [Reuters, Feb 18, 2009]
     Bernanke is doing a better job than Greenspan.
  5. (image-only slide)
  6. Questions for business & government:
     What are people saying? What’s hot/trending?
     What are they saying about {topic|person|product} X?
     ... about X versus {topic|person|product} Y?
     How has opinion about X and Y evolved?
     How has opinion correlated with {our|competitors’|general} {news|marketing|sales|events}?
     What’s behind opinion, the root causes?
     Who are opinion leaders?
     How does sentiment propagate across multiple channels?
  7. Is sentiment a search problem?
  8. Information access w/ structure, sentiment:
     User intent?
     Sentiment
     Sentiment+
  9. “In this example, you can quickly see that the Drooling Dog Bar B Q has gotten lots of positive reviews, and if you want to see what other people have said about the restaurant, clicking this result is a good choice.”
     --
     “In the recap of [Searchology] from Google’s Matt Cutts, he tells us that: ‘If you sort by reviews, Google will perform sentiment analysis and highlight interesting comments.’”
     -- Bill Slawski, “Google's New Review Search Option and Sentiment Analysis”
  10. (image-only slide)
  11. For better information access, understand user intent.
      User intent?
  12. We have a decision support need. We =
      Consumers
      Marketers
      Competitors
      Managers
      Decision support requires tools beyond general-purpose search/information access…
  13. Counting term hits, in one source, at the doc level, doesn’t take you far...
      Good or bad? What’s behind the posts?
  14. Counting -- clicks, not even keywords -- leaves you wondering Why? and So What?
  15. “Sentiment analysis is the task of identifying positive and negative opinions, emotions, and evaluations.”
      -- Wilson, Wiebe & Hoffmann, 2005, “Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis”
      “Sentiment analysis or opinion mining is the computational study of opinions, sentiments and emotions expressed in text… An opinion on a feature f is a positive or negative view, attitude, emotion or appraisal on f from an opinion holder.”
      -- Bing Liu, 2010, “Sentiment Analysis and Subjectivity,” in Handbook of Natural Language Processing
  16. Sentiment analysis turns attitudes into data.
      Ingredients:
      Structured and unstructured sources.
      Subjectivity – WW&H used over 8,000 clues.
      Polarity: positive, negative, (both,) or neutral.
      Intensity.
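To make these ingredients concrete, here is a toy lexicon-based scorer: a handful of polarity clues plus intensity boosting. It is a minimal sketch, not the WW&H system; every word and weight below is invented for illustration.

```python
# Toy clue lexicon with polarity weights. Real systems are far larger
# and context-aware (WW&H used over 8,000 clues); these entries are
# invented for illustration only.
CLUES = {
    "good": 1.0, "great": 2.0, "love": 2.0,       # positive clues
    "bad": -1.0, "terrible": -2.0, "hate": -2.0,  # negative clues
}
INTENSIFIERS = {"very": 1.5, "way": 1.5, "too": 1.2}  # intensity boosters

def polarity(text: str) -> float:
    """Sum clue weights over tokens, boosting a clue that follows an intensifier."""
    total, boost = 0.0, 1.0
    for token in text.lower().split():
        word = token.strip(".,!?")
        if word in INTENSIFIERS:
            boost *= INTENSIFIERS[word]  # stack chains like "way too"
            continue
        total += boost * CLUES.get(word, 0.0)
        boost = 1.0  # a boost applies only to the next word
    return total  # > 0 positive, < 0 negative, 0 neutral/no clues found

print(polarity("Unemployment is WAY TOO HIGH!! That is terrible."))  # negative
```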
  17. There are many complications. Simplified:
      Sentiment may be of interest at multiple levels:
        Corpus / data space, i.e., across multiple sources.
        Document.
        Statement / sentence.
        Entity / topic / concept.
      Human language is noisy and chaotic!
        Jargon, slang, irony, ambiguity, anaphora, polysemy, synonymy, etc.
      Context is key. Discourse analysis comes into play.
      Must distinguish the sentiment holder from the object: Greenspan said the recession will…
  18. Sentiment sources (broadly):
      News
      Social media
      Enterprise feedback
      Consumption models:
      Push
      Pull (a.k.a. search):
        General search engine
        Siloed/vertical search interface
        Application embedded
        Widgets/gadgets
  19. (image-only slide)
  20. Rated negative?
  21. Manual focus
      ???
  22. An accuracy aside: [WWH 2005] describes an inter-annotator agreement test.
      10 documents with 447 subjective expressions. The two annotators agree on 82% of cases.
      Excluding uncertain subjective expressions (18%) boosts agreement to 90%.
      (Wilson, Wiebe & Hoffmann, 2005, “Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis”)
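Both percentages are simple counting over annotation pairs. The helper below sketches the computation; it is illustrative, not the paper's evaluation code, and the "uncertain" tag name and mini-dataset are assumptions.

```python
def percent_agreement(labels_a, labels_b, exclude=None):
    """Share of expressions two annotators label identically; optionally
    drop expressions either annotator tagged as uncertain."""
    pairs = [(a, b) for a, b in zip(labels_a, labels_b)
             if exclude is None or exclude not in (a, b)]
    return sum(a == b for a, b in pairs) / len(pairs)

# Invented mini-example: agreement rises once "uncertain" cases are
# dropped, mirroring the slide's 82% -> 90% shift over 447 expressions.
a = ["pos", "neg", "uncertain", "pos", "neg"]
b = ["pos", "pos", "neg",       "pos", "neg"]
print(percent_agreement(a, b))                       # 0.6 overall
print(percent_agreement(a, b, exclude="uncertain"))  # 0.75 excluding uncertain
```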
  23. Claim: You fall far short with (only) --
      Doc-level analysis.
      Keyword-based analysis.
      For text, you need strong natural language processing (NLP) for information extraction:
      “A direct opinion is a quintuple (o_j, f_jk, oo_ijkl, h_i, t_l), where o_j is an object, f_jk is a feature of the object o_j, oo_ijkl is the orientation or polarity of the opinion on feature f_jk of object o_j, h_i is the opinion holder and t_l is the time when the opinion is expressed by h_i.” [Liu 2010]
      … index at will!
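Liu's quintuple maps naturally onto a record type, which is exactly what makes extracted opinions indexable. A minimal sketch, with field names paraphrasing the quoted definition and example values drawn from the Greenspan quote on slide 4:

```python
# Liu's direct-opinion quintuple as a record: (object, feature,
# orientation, holder, time). The class and field names are this
# sketch's choices, not Liu's notation verbatim.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Opinion:
    obj: str          # o_j, the object under discussion
    feature: str      # f_jk, a feature of object o_j
    orientation: str  # oo_ijkl, polarity of the opinion on f_jk
    holder: str       # h_i, who expressed the opinion
    time: date        # t_l, when the opinion was expressed

# From slide 4: the holder (Greenspan) is distinct from the object.
op = Opinion(obj="global recession", feature="depth",
             orientation="negative", holder="Alan Greenspan",
             time=date(2009, 2, 18))
print(op)
```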
  24. Boost accuracy via ratings & classification:
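One reading of this slide: star ratings can serve as (noisy) polarity labels for training a text classifier. A minimal scikit-learn sketch; the toy reviews, the 4-star threshold, and the Naive Bayes model are all assumptions for illustration, not the deck's method.

```python
# Use review star ratings as weak supervision for a polarity classifier.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reviews = ["Great ribs, friendly staff", "Cold food, rude service",
           "Loved every bite", "Never going back"]
ratings = [5, 1, 5, 2]
# Invented threshold: 4+ stars counts as a positive label.
labels = ["pos" if r >= 4 else "neg" for r in ratings]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(reviews, labels)
print(clf.predict(["The service was rude"]))  # likely 'neg'
```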
  25. Next slides have a few more examples:
      A Jodange embeddable “gadget.”
      A now-defunct media portal from the Financial Times Group.
  26. (image-only slide)
  27. (image-only slide)
  28. Beyond polarity: “We present a system that adds an emotional dimension to an activity that Internet users engage in frequently, search.”
      -- Sood, Vasserman & Hoffman, 2009, “ESSE: Exploring Mood on the Web”
  29. The three prominent mood groups that emerged from K-Means clustering on the set of LiveJournal mood labels:
      Happy: Energetic, Bouncy, Happy, Hyper, Cheerful, Ecstatic, Excited, Jubilant, Giddy, Giggly
      Sad: Confused, Crappy, Crushed, Depressed, Distressed, Envious, Gloomy, Guilty, Intimidated, Jealous, Lonely, Rejected, Sad, Scared
      Angry: Aggravated, Angry, Bitchy, Enraged, Infuriated, Irate, Pissed off
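A sketch of how such groups could be recovered: represent each mood label as a feature vector and run k-means with k=3. ESSE's actual features aren't given on the slide, so the co-occurrence-style vectors below are invented placeholders.

```python
# Cluster mood labels into 3 groups with k-means. The feature vectors
# (toy co-occurrence counts against three seed moods) are invented;
# only the k=3 clustering of LiveJournal mood labels comes from the slide.
import numpy as np
from sklearn.cluster import KMeans

labels = ["happy", "cheerful", "ecstatic", "sad", "gloomy", "lonely",
          "angry", "irate", "enraged"]
X = np.array([[9, 1, 0], [8, 2, 0], [9, 0, 1],
              [1, 9, 1], [0, 8, 2], [1, 9, 0],
              [0, 2, 9], [1, 1, 8], [0, 1, 9]])

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
for lab, c in zip(labels, km.labels_):
    print(lab, "-> cluster", c)  # three groups emerge, one per seed mood
```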
  30. Questions?
      Comments?