Computational Humor: Can a machine have a sense of humor? (2020)

Can computers have a sense of humor? In this talk, we discuss some dimensions of computational humor, where the research field stands and showcase some of Thomas Winters' work in this field.

This talk has been given multiple times by Thomas Winters, in particular on the 11th of December 2020 as a DTAI seminar (DTAI is KU Leuven's Declarative Languages and Artificial Intelligence research lab).

More information about this talk at https://thomaswinters.be/talk/2020dtai

Abstract: Can a machine have a sense of humor? At first glance, this question may seem paradoxical, given that humor is an intrinsically human trait. By limiting the scope to specific types of jokes and by hand-coding rules for them, researchers have generally been able to create several methods for detecting and generating humor. Recently, large-scale pre-trained language models like BERT, GPT-2/GPT-3 and our own Dutch RobBERT model opened the way for learning even better insights into humor. In this talk, we provide an overview of the history of computational humor, discuss several types of humor tasks that have been automated using artificial intelligence, illustrate several useful applications of computational humor, and position several of our own research projects in this field.


Computational Humor: Can a machine have a sense of humor? (2020)

  1. Computational Humor: Can a machine have a sense of humor? Thomas Winters, PhD Student at KU Leuven & FWO Fellow. @thomas_wint | thomaswinters.be
  2. Can a machine have a sense of humor?
  3. Part 1: Humor. Intrinsically human!
  4. Every known human civilisation has some form of humor. Caron, J.E.: From ethology to aesthetics: Evolution as a theoretical paradigm for research on laughter, humor, and other comic phenomena. Humor 15(3), 245–281 (2002)
  5. Purpose of Humor = Sign of Intelligence?
  6. Purpose of Humor = Sign of Intelligence? huh?
  7. Purpose of Humor = Sign of Intelligence? huh? aha!
  8. Purpose of Humor = Sign of Intelligence? huh? aha! that's funny
  9. Purpose of Humor = Sign of Intelligence? huh? aha! that's funny
  10. Purpose of Humor = Sign of Intelligence? The brain rewards noticing incongruities and successfully resolving them
  11. Purpose of Humor = Sign of Intelligence? The brain rewards noticing incongruities and successfully resolving them + linguistic skills (required to create jokes)
  12. Purpose of Humor = Sign of Intelligence? The brain rewards noticing incongruities and successfully resolving them + linguistic skills (required to create jokes) = Evolutionary advantage!
  13. How does humor work?
  14. Incongruity-Resolution Theory. Based on: Ritchie, G. (1999). Developing the incongruity-resolution theory. "Two fish are in a tank. Says one to the other: 'Do you know how to drive this thing?'"
  15. Incongruity-Resolution Theory: the first sentence is the Setup.
  16. Incongruity-Resolution Theory: the Setup evokes an Obvious Interpretation.
  17. Incongruity-Resolution Theory: the second sentence is the Punchline.
  18. Incongruity-Resolution Theory: the Punchline reveals a Hidden Interpretation of the Setup.
  19. Human-focused definition! A machine should not only spot the two mental images (Obvious Interpretation and Hidden Interpretation), but also that resolving them is not too hard or too easy for a human!
  20. Humor = AI-complete? ~ as hard as making computers as "intelligent" as people
  21. Part 2: Humor Generation
  22. Hard-coding a sense of humor
  23. JAPE joke examples: "What is the difference between leaves and a car? One you brush and rake, the other you rush and brake." "What do you call a strange market? A bizarre bazaar." Binsted, K., & Ritchie, G. (1994). An implemented model of punning riddles
  24. JAPE: Templates & Schemas. Binsted, K., & Ritchie, G. (1994). An implemented model of punning riddles
  25. JAPE template: "What's <CharacteristicNP> and <Characteristic1>? A <Word1> <Word2>."
  26. JAPE schema: relates Word1, Word2, Homophone1, Characteristic1, CharacteristicNP and the Noun Phrase.
  27. JAPE schema instantiation: Word1 = spring (season), Homophone1 = spring (elastic body), Characteristic1 = to bounce, Word2 = cabbage, CharacteristicNP = green, Noun Phrase = spring cabbage.
  28. Result: "What's green and bounces? A spring cabbage."
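The JAPE-style template-and-schema instantiation on slides 25–28 can be sketched in a few lines. This is a hedged illustration, not the original Prolog system: the mini-schema below hard-codes the "spring cabbage" example rather than querying a real lexicon.

```python
# Illustrative sketch of JAPE-style template + schema filling.
# The schema links the noun phrase's words to a homophone and characteristics.
SCHEMA = {
    "word1": "spring",             # first word of the noun phrase (season)
    "word2": "cabbage",            # second word of the noun phrase
    "homophone1": "spring",        # homophone of word1 (elastic body)
    "characteristic1": "bounces",  # characteristic of the homophone
    "characteristic_np": "green",  # characteristic of the whole noun phrase
}

TEMPLATE = "What's {characteristic_np} and {characteristic1}? A {word1} {word2}."

def instantiate(template: str, schema: dict) -> str:
    """Fill the template's slots with the schema's values."""
    return template.format(**schema)

print(instantiate(TEMPLATE, SCHEMA))
# -> What's green and bounces? A spring cabbage.
```

A real lexicon would supply many schema instantiations; the template stays fixed while the schema constrains which word combinations yield a pun.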
  29. Unsupervised Analogy Generator. Petrović, S., & Matthews, D. (2013). Unsupervised joke generation from big data
  30. Template: "I like my <X> like I like my <Y>: <Z>"
  31. Choose X, Y, Z to maximise the co-occurrence of Z with both X and Y, maximise adjective dissimilarity, maximise the number of definitions of Z, and minimise the commonness of Z.
  32. Example instantiation: X = coffee, Y = war, Z = cold.
  33. Result: "I like my coffee like I like my war: cold."
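The scoring idea behind the analogy generator can be sketched as a product of simple factors. This is a hedged toy version: the co-occurrence counts and the exact combination of factors are invented for illustration, whereas Petrović & Matthews estimate them from large n-gram data and use more factors (e.g. number of dictionary senses of Z).

```python
# Toy sketch of "I like my X like I like my Y: Z" triple scoring:
# favour a Z that co-occurs with both X and Y, applied to nouns whose
# adjective profiles are dissimilar, while penalising common adjectives.
from itertools import product
from math import sqrt

cooc = {  # invented (noun, adjective) co-occurrence counts
    ("coffee", "cold"): 40, ("coffee", "hot"): 900,
    ("war", "cold"): 800, ("war", "hot"): 30,
    ("tea", "cold"): 35, ("tea", "hot"): 950,
}
commonness = {"cold": 200, "hot": 1000}  # overall adjective frequency
nouns = ["coffee", "war", "tea"]
adjectives = ["cold", "hot"]

def profile(noun):
    return [cooc.get((noun, a), 0) for a in adjectives]

def dissimilarity(x, y):
    """1 - cosine similarity of the two nouns' adjective profiles."""
    px, py = profile(x), profile(y)
    dot = sum(a * b for a, b in zip(px, py))
    norms = sqrt(sum(a * a for a in px)) * sqrt(sum(b * b for b in py))
    return 1 - dot / norms

def score(x, y, z):
    shared = min(cooc.get((x, z), 0), cooc.get((y, z), 0))
    return shared * dissimilarity(x, y) / commonness[z]

best = max(
    ((x, y, z) for x, y, z in product(nouns, nouns, adjectives) if x != y),
    key=lambda t: score(*t),
)
print("I like my %s like I like my %s: %s." % best)
# -> I like my coffee like I like my war: cold.
```

Note how the dissimilarity factor rules out the boring "coffee / tea: hot" triple even though "hot" co-occurs strongly with both: the incongruity between the two nouns is what makes the analogy a joke.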
  34. Talk Generator: generates nonsense PowerPoints about any given topic. Try it yourself: talkgenerator.com. Winters T., Mathewson K. (2019). Automatically Generating Engaging Presentation Slide Decks
  35.–40. Talk Generator example slides (screenshots). Try it yourself: talkgenerator.com
  41. Templates + Grammar + Functions = Babbly. Created a programming language for template-based text generation with grammars for more variation. Example Babbly program: import firstname.words; food = pasta|pizza|sushi; main = { 3: <firstname> loves <food.uppercase>! 1: <firstname> (quite|reasonably|fairly) likes <food>. Oo{1,3}h, I hope they join! 1: <firstname:protagonist> is not (quite){.5} fond of <food:>. <firstname:protagonist> will thus not go to the <food:> (restaurant|place). } Winters, T. (2019). Modelling Mutually Interactive Fictional Character Conversational Agents
  42. E.g. to easily program a Samson Twitterbot. Winters, T. (2019). Modelling Mutually Interactive Fictional Character Conversational Agents
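The core mechanism of a Babbly-like grammar — weighted alternatives plus recursive slot expansion — can be sketched in plain Python. This is an illustrative stand-in, not the actual Babbly implementation or its full syntax (named captures like `<firstname:protagonist>` and repetitions like `Oo{1,3}h` are omitted).

```python
# Minimal sketch of a Babbly-like weighted template grammar:
# each symbol maps to weighted alternatives; <slot> references recurse.
import random

GRAMMAR = {
    "firstname": [(1, "Alice"), (1, "Bob")],
    "food": [(1, "pasta"), (1, "pizza"), (1, "sushi")],
    # weighted alternatives, as in Babbly's "3: ... 1: ..." main rule
    "main": [
        (3, "<firstname> loves <food>!"),
        (1, "<firstname> quite likes <food>."),
    ],
}

def expand(symbol, rng):
    """Pick one weighted alternative and recursively expand its slots."""
    options = GRAMMAR[symbol]
    pick = rng.uniform(0, sum(w for w, _ in options))
    for weight, text in options:
        pick -= weight
        if pick <= 0:
            break
    out, rest = [], text
    while "<" in rest:
        before, _, tail = rest.partition("<")
        slot, _, rest = tail.partition(">")
        out.append(before)
        out.append(expand(slot, rng))
    out.append(rest)
    return "".join(out)

print(expand("main", random.Random(0)))
```

Running it repeatedly yields varied sentences; the 3:1 weighting makes the "loves" variant about three times as likely as the "quite likes" one.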
  43. Learning a sense of humor?
  44. Transformer models: large language models, pretrained on large corpora, outperforming previous neural architectures on most language tasks. GPT-2 & GPT-3 complete any textual prompt; BERT classifies any text sequence or token. Brown, Tom B., et al. "Language models are few-shot learners."; Devlin, J., Chang, M. W., Lee, K., & Toutanova, K. (2018). BERT: Pre-training of deep bidirectional transformers for language understanding
  45. Not just for English! RobBERT: our Dutch BERT-like model. Phase 1, pretraining: find an unlabeled corpus for the target language; the pretraining phase is expensive, BUT reusable — just download an existing model! Phase 2, finetuning the pre-trained BERT: feed it labeled training data; fine-tuning for the target task is relatively cheap and usually outperforms other language models on most NLP tasks. Delobelle, P., Winters, T., & Berendt, B. (2020). RobBERT: a Dutch RoBERTa-based language model.
  46. GPT-2 / GPT-3 humor? Tokenizer problem: the tokenizer maps words to numbers (e.g. "tank" → 28451, "thank" → 40716), which removes knowledge about letters! This makes it hard to make or recognize new wordplay. https://www.gwern.net/GPT-3 https://towardsdatascience.com/teaching-gpt-2-a-sense-of-humor-fine-tuning-large-transformer-models-on-a-single-gpu-in-pytorch-59e8cec40912
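The tokenizer problem from slide 46 is easy to demonstrate with a toy vocabulary. The IDs below are invented (not real GPT-2 token IDs): the point is that two words that are nearly identical at the character level become arbitrary, unrelated integers once tokenized, so the model never sees the spelling that puns rely on.

```python
# Toy illustration of how word-level tokenization hides spelling.
vocab = {"tank": 101, "thank": 102, "think": 103}  # invented IDs

def spelling_overlap(a: str, b: str) -> float:
    """Character-set similarity -- information the model never receives."""
    shared = len(set(a) & set(b))
    return shared / len(set(a) | set(b))

# At the character level, "tank" and "thank" share almost all letters...
print(spelling_overlap("tank", "thank"))  # -> 0.8
# ...but the model only sees two unrelated integers:
print(vocab["tank"], vocab["thank"])
```

Character-aware models (or character-level features) are one way around this, at the cost of much longer input sequences.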
  47. GPT-2 / GPT-3 joke examples: GPT-2 can mimic joke style, but is usually very absurd. "What did one plate say to the other plate? Dip me!" "What did the chicken say after he got hit by a bus? 'I'm gonna be fine!'" "A woman walks into the bar... she says to the bartender 'I want a double entendre' and the bartender says 'No, we don't serve that'" https://www.gwern.net/GPT-3 https://towardsdatascience.com/teaching-gpt-2-a-sense-of-humor-fine-tuning-large-transformer-models-on-a-single-gpu-in-pytorch-59e8cec40912
  48. Reminiscent of how children write jokes
  49. Let's try simpler & more understandable models!
  50. Dynamic Templates. Used in @TorfsBot. Winters, T. (2019). Generating philosophical statements using interpolated markov models and dynamic templates.
  51. Template source: "Are there also atheists that don't believe in atheism?" Context: "They see the fact that the former God is not trying to deny this newly acquired insight as proof of them being right. Even with the Church, things are not going well. Norse popes."
  52.–54. Replace some of the template's words with context words that have a matching POS tag, preferring the words with the lowest frequencies.
  55. Result: "Are there also popes that don't believe in God?"
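The dynamic-template step on slides 51–55 can be sketched as: treat the rarest words of a sentence as slots, then fill them with same-POS words from the context. This is a hedged toy version — the frequency table and POS lexicon below are hand-made, where the real system uses corpus counts and a POS tagger.

```python
# Sketch of dynamic templates: the rarest words become slots, filled by
# context words with a matching POS tag (toy frequency/POS data).
freq = {"are": 9e6, "there": 8e6, "also": 7e6, "that": 9e6, "don't": 5e6,
        "believe": 1e6, "in": 9e6, "atheists": 2e3, "atheism": 1e3}
pos = {"atheists": "NOUN-PL", "atheism": "NOUN"}

def dynamic_template(sentence, replacements, n_slots=2):
    """Replace the n rarest words with same-POS words from `replacements`."""
    words = sentence.lower().rstrip("?").split()
    rarest = sorted((w for w in words if w in freq), key=lambda w: freq[w])[:n_slots]
    out = []
    for w in words:
        if w in rarest and pos.get(w) in replacements:
            out.append(replacements[pos[w]])
        else:
            out.append(w)
    return " ".join(out).capitalize() + "?"

# Context words extracted from the surrounding text, keyed by POS tag:
context = {"NOUN-PL": "popes", "NOUN": "god"}
print(dynamic_template("Are there also atheists that don't believe in atheism?", context))
# -> Are there also popes that don't believe in god?
```

Because only the lowest-frequency (i.e. most topical) words are replaced, the grammatical frame of the original sentence survives while its topic shifts to the context's.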
  56. GITTA: Template Trees for extracting templates from example texts. Winters, T., & De Raedt, L. (2020). Discovering Textual Structures: Generative Grammar Induction using Template Trees.
  57.–58. Algorithm: 1. Join the closest strings. 2. Merge similar template slots. 3. Iteratively simplify until convergence.
  59. Resulting grammar: A: I like my <B> and my <B> | <G> the <B> is <F>; B: chicken | cat | dog; F: walking | jumping; G: Alice | Bob | Cathy
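The first step of the algorithm — joining the closest strings into a template — can be sketched with the standard library's `difflib`. This is only an illustration of that single merge step, not GITTA's full template-tree induction (which also merges slots across templates and iterates to convergence).

```python
# Sketch of template induction's merge step: turn the word-level
# differences between two similar sentences into template slots.
from difflib import SequenceMatcher

def merge_to_template(a: str, b: str) -> str:
    wa, wb = a.split(), b.split()
    parts = []
    for op, i1, i2, j1, j2 in SequenceMatcher(a=wa, b=wb).get_opcodes():
        if op == "equal":
            parts.extend(wa[i1:i2])  # shared words stay literal text
        else:
            parts.append("<slot>")   # differing words become a slot
    return " ".join(parts)

print(merge_to_template("Alice the cat is walking", "Bob the dog is jumping"))
# -> <slot> the <slot> is <slot>
```

Merging the slot fillers across many such pairs is what produces the named non-terminals (`<G> the <B> is <F>`) shown on slide 59.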
  60. Generalising Schemas to Enable Machine Learning: constraint-based schemas are hard to tweak to preference; a metrical schema (Header, Keywords, Template, Features, Aggregator, Values Generator → Humour) allows for learning in the aggregator component. Winters, T., Nys, V., & De Schreye, D. (2019). Towards a general framework for humor generation from rated examples.
  61. GOOFER: generalised framework with a Template Extractor, Template Store, Values Generator, Metric-based Rater, Classifier/Regression and Human Evaluation.
  62. 1. Training. Input jokes: a) I like my relations like I like my source, open. b) I like my coffee like I like my war, cold. c) The sailor bears a stress. Pier pressure. d) I like my girlfriend like I like my estate: real. e) The lord cultivates a region. A baron land.
  63. Human raters score the input jokes (joke a: 5, 4, 5; b: 4, 4, 3; c: 5, 3, 4; d: 2, 1, 3; e: 3, 2, 2).
  64. The Template Extractor produces a set of rated template values per template. Template 1, "I like my X like I like my Y, Z.": (relations, source, open | [5,4,5]), (coffee, war, cold | [4,4,3]), (girlfriend, estate, real | [2,1,3]). Template 2, "The A B a C. D.": (sailor, bears, stress, Pier pressure | [5,3,4]), (lord, cultivates, region, A baron land | [3,2,2]).
  65. Rated template values are combined with feature values (e.g. FreqX, FreqY, FreqZ, FreqXY, ...) into training data for the classifier/regression.
  66. The trained rater completes the training phase.
  67. 2. Generating. Input topic, e.g. "Men".
  68. Word + template: "Men" + "I like my X like I like my Y, Z."
  69. The Values Generator proposes template values: a) men, grave, nameless b) men, country, cold c) men, turkey, roast d) men, rain, gentle e) men, laugh, silly
  70. The same features are computed for the generated template values, which the trained classifier then scores.
  71. Filtered template values set: men, grave, nameless => 5; men, country, cold => 1; men, turkey, roast => 3; men, rain, gentle => 4; men, laugh, silly => 1.
  72. Applying the template to the best-scoring values yields jokes about the seed: "I like my men like I like my grave: nameless", "I like my men like I like my rain: gentle".
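The two-phase GOOFER pipeline on slides 62–72 can be compressed into a short sketch: rated template values become feature vectors, a rater is trained on them, and new candidate values are scored and filtered before the template is applied. The single scalar "feature" and the 1-nearest-neighbour rater below are hedged stand-ins for the paper's real frequency features and learned classifiers.

```python
# Compact sketch of the GOOFER pipeline (toy features and rater).
TEMPLATE = "I like my {x} like I like my {y}: {z}."

# 1. Training: template values with a toy feature and a mean human rating
train = [  # ((x, y, z), feature, mean rating)
    (("relations", "source", "open"), 0.9, 4.7),
    (("coffee", "war", "cold"), 0.8, 3.7),
    (("girlfriend", "estate", "real"), 0.2, 2.0),
]

def rate(feature):
    """1-NN rater: rating of the training example with the closest feature."""
    return min(train, key=lambda ex: abs(ex[1] - feature))[2]

# 2. Generating: candidate values for the topic "men", with toy features
candidates = [(("men", "grave", "nameless"), 0.85),
              (("men", "country", "cold"), 0.25),
              (("men", "rain", "gentle"), 0.75)]

# Score every candidate and keep only the ones above threshold
kept = [vals for vals, f in candidates if rate(f) >= 3.5]
for x, y, z in kept:
    print(TEMPLATE.format(x=x, y=y, z=z))
```

The framework's point is exactly this separation: the template machinery stays fixed, while the rater can be swapped for any classifier or regression model trained on rated examples.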
  73.–74. GAG: implements the GOOFER framework. Winters, T., Nys, V., & De Schreye, D. (2018). Automatic Joke Generation: Learning Humor from Examples
  75. GAG examples: "I like my coffee like I like my sleep: strong." "I like my men like I like my graves: nameless." "I like my women like I like my tests: independent."
  76. Frequency of 4+ ratings: GAG with classifier: 11.41%; human-created jokes on JokeJudger: 22.61%; human-created single-word jokes: 21.08%.
  77. Need for good humor detection algorithms!
  78. Part 3: Humor Detection
  79. Early humor detector: designed humor features (e.g. alliteration, antonymy, adult slang...), used Naive Bayes and Support Vector Machines. Task: one-liners vs news, a neutral corpus & proverbs. Mihalcea, R., & Strapparava, C. (2005). Making computers laugh: Investigations in automatic humor recognition.
  80. But is this a good dataset? News & proverbs have completely different types of words than jokes, so looking at word frequencies is often already "enough"! Is this really humor detection?
  81. Jokes are fragile! "Two fish are in a tank. Says one to the other: 'Do you know how to drive this thing?'" Winters T., Delobelle P. (2020). Dutch humor detection by generating negative examples. BNAIC/Benelearn 2020
  82.–85. Generate non-jokes using dynamic templates: replacing e.g. "fish" with "men" and "tank" with "bar" destroys the joke while keeping its surface form, so word-based features won't work anymore!
  86. Binary classification of jokes versus texts from other domains (jokes vs news / jokes vs proverbs / jokes vs generated jokes): Naive Bayes: 51% / 60% / 50%; LSTM: 94% / 94% / 47%; CNN: 94% / 94% / 47%; RobBERT: 99% / 96% / 89%. A much more challenging dataset! More truthful humor detection?
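Why the generated negative examples are so much harder can be shown with a toy bag-of-words score. The strings and vocabulary below are illustrative: a word-frequency signal that cleanly separates jokes from news gives almost no signal against a non-joke built by swapping a few words inside a real joke.

```python
# Toy demonstration: word-based features separate jokes from news,
# but not from dynamic-template-generated non-jokes.
joke = "two fish are in a tank"
news = "the minister announced a new budget today"
non_joke = "two men are in a bar"  # template corruption of the joke

joke_vocab = set(joke.split())

def joke_word_ratio(text):
    """Fraction of a text's words that also occur in the joke corpus."""
    words = text.split()
    return sum(w in joke_vocab for w in words) / len(words)

print(joke_word_ratio(news))      # low: news shares few joke words
print(joke_word_ratio(non_joke))  # high: the non-joke still "looks" like a joke
```

This is exactly the pattern in the table on slide 86: word-frequency-based models (Naive Bayes) and even LSTMs/CNNs drop to chance on jokes vs generated jokes, while RobBERT, which models meaning in context, still reaches 89%.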
  87. GALMET: Genetic Algorithm using Language Models for Evolving Text. Created an algorithm that evolves headlines into satirical headlines. Winters T., Delobelle P. (unpublished). Survival of the Wittiest: Evolving Satire with Language Models
  88. Survival of the Wittiest example: "Amazon removes Indian flag doormat after minister threatens visa ban" → "NASA releases rainbow leg doormat after violating OT ban". Winters T., Delobelle P. (unpublished). Survival of the Wittiest: Evolving Satire with Language Models
  89. Symbiotic relationship: humor detection can help improve humor generation, and humor generation can help improve humor detection.
  90. Part 4: Why?
  91. Grammarly but for jokes?
  92. Virtual assistants: people often treat virtual assistants like humans ("please", "thank you", "sorry") [1]; 41% say it's like talking to a friend [1]; humor and creativity in language are important between friends [2]. [1] https://www.thinkwithgoogle.com/consumer-insights/voice-assistance-consumer-experience/ [2] Gray, A.W., Parkinson, B. & Dunbar, R.I. (2015). Laughter's Influence on the Intimacy of Self-Disclosure. Hum Nat 26: 28.
  93. Casual creation: it's fun to create these little bots! https://thomaswinters.be/mopjesbot Winters, T. (2019). Generating Dutch Punning Riddles about Current Affairs
  94. Mutually interactive bots. https://thomaswinters.be/samsonbots Winters, T. (2019). Modelling Mutually Interactive Fictional Character Conversational Agents
  95. Studying & understanding humor: we do not fully know how humor works; Artificial Intelligence can shed light on it!
  96. Can computers have a sense of humor? Hand-craft rules for a type of humor, or learn rules if there is enough data; better NLP models get us closer & closer.
  97. BUT: an AI-complete problem! One needs to process two mental images before fully understanding a joke.
  98. Computational humor is still in its infancy. But children have a fascinating sense of humor!
  99. Thank you! Thomas Winters, PhD Student at KU Leuven & FWO Fellow. @thomas_wint | thomaswinters.be. Some images based on the works of dooder on freepik.com

Editor's Notes

  • Haha and Aha are very similar
  • Generated 100 jokes and submitted them to JokeJudger; this yielded 9,500 ratings & markings.
    First three categories: all jokes.

    Classifier > Regression.
    In half of the cases, the computer's jokes are as good as human jokes.
    They can be filtered by humans!
  • Why do we need computational humour? Let's look at virtual assistants.
