Continuous Sentiment Intensity Prediction based on Deep Learning
Yunchao He (何云超)
2015.9.15 @ Yuan Ze University
 “unbelievably disappointing”
 “Full of zany characters and richly applied satire, and some great plot twists”
 “this is the greatest screwball comedy ever filmed”
 “It was pathetic. The worst part about it was the boxing scenes.”
 Sentiment Analysis
 Using NLP, statistics, or machine learning methods to extract, identify, or otherwise
characterize the sentiment content of a text unit
 Sometimes called opinion mining, although the emphasis in this case is on extraction
 Other names: opinion extraction, sentiment mining, subjectivity analysis
2
3
 Movie: is this review positive or negative?
 Products: what do people think about the new iPhone?
 Public sentiment: how is consumer confidence? Is despair increasing?
 Politics: what do people think about this candidate or issue?
 Prediction: predict election outcomes or market trends from sentiment
4
 Short text classification based on Semantic clustering
 Sentiment intensity prediction using CNN
 Transfer Learning*
* Future work 5
 People express opinions in complex ways
 In opinion texts, lexical content alone can be misleading
 Intra-textual and sub-sentential reversals, negation, and topic changes are common
 Rhetorical devices such as sarcasm, irony, implication, etc.
6
 Tokenization
 Feature extraction: n-grams, semantic and syntactic features, etc.
 Classification using different classifiers
 Naïve Bayes
 MaxEnt
 SVM
 Drawback
 Feature Sparsity
S1: I really like this movie
[...0 0 1 1 1 1 1 0 0 ... ]
8
S1: This phone has a good keypad
S2: He will move and leave her for good
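As a rough illustration of the sparsity drawback (my own sketch, not from the slides), here is a minimal bag-of-words encoding with scikit-learn; the sentences are the examples above.

```python
from sklearn.feature_extraction.text import CountVectorizer

# Minimal sketch of bag-of-words features; sentences are the slide's examples.
sentences = [
    "I really like this movie",
    "This phone has a good keypad",
    "He will move and leave her for good",
]
vectorizer = CountVectorizer()           # unigram counts
X = vectorizer.fit_transform(sentences)  # sparse matrix, one row per sentence

print(vectorizer.get_feature_names_out())
print(X.toarray())  # mostly zeros: each short text activates only a few dimensions
```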
 A clustering algorithm is used to aggregate short texts into larger clusters, where each
cluster shares the same topic and the same sentiment polarity; this reduces the sparsity of
the short-text representation while keeping it interpretable.
S1: it works perfectly! Love this product
S2: very pleased! Super easy to, I love it
S3: I recommend it
it works perfectly love this product very pleased super easy to I recommend
S1: [1 1 1 1 1 1 0 0 0 0 0 0 0]
S2: [0 0 0 1 0 0 1 1 1 1 1 1 0]
S3: [1 0 0 0 0 0 0 0 0 0 0 1 1]
S1+S2+S3: [...0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0...]
9
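A minimal sketch (my own illustration) of how merging the one-hot vectors of a cluster's members densifies the representation, using the S1-S3 vectors above:

```python
import numpy as np

# One-hot rows from the example above (S1-S3 over the 13-word vocabulary).
S1 = np.array([1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0])
S2 = np.array([0, 0, 0, 1, 0, 0, 1, 1, 1, 1, 1, 1, 0])
S3 = np.array([1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1])

# The merged cluster vector covers the whole example vocabulary, so the
# cluster-level representation is far less sparse than any single short text.
cluster = np.clip(S1 + S2 + S3, 0, 1)
print(cluster)  # [1 1 1 1 1 1 1 1 1 1 1 1 1]
```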
 Training data labeled with positive and negative polarity
 The K-means clustering algorithm is used to cluster the positive and negative texts
separately.
 K-means, KNN, LDA…
Mixed training texts:
  works perfectly! Love this product / completely useless, return policy / very pleased! Super easy to, I am pleased / was very poor, it has failed / highly recommend it, high recommended! / it totally unacceptable, is so bad
Topical clusters after clustering:
  Cluster 1 (positive): works perfectly! Love this product / very pleased! Super easy to, I am pleased / highly recommend it, high recommended!
  Cluster 2 (negative): completely useless, return policy / was very poor, it has failed / it totally unacceptable, is so bad
10
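A minimal sketch of the clustering step, assuming TF-IDF features and scikit-learn's KMeans; the feature choice and the number of clusters are my assumptions, not stated in the slides.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

positive_texts = ["works perfectly! Love this product",
                  "very pleased! Super easy to, I am pleased",
                  "highly recommend it, high recommended!"]
negative_texts = ["completely useless, return policy",
                  "was very poor, it has failed",
                  "it totally unacceptable, is so bad"]

vectorizer = TfidfVectorizer()
X_all = vectorizer.fit_transform(positive_texts + negative_texts)
X_pos = X_all[: len(positive_texts)]
X_neg = X_all[len(positive_texts):]

# Cluster positive and negative texts separately, as on the slide;
# k = 2 per polarity is an arbitrary choice for this sketch.
pos_clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_pos)
neg_clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_neg)
print(pos_clusters, neg_clusters)
```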
Classifier: Multinomial Naive Bayes
Probabilistic classifier: estimate the probability of a label given a clustered text
\hat{s}_{C_i} = \arg\max_{s \in S} P(s \mid C_i) = \arg\max_{s \in S} P(s) \prod_{j \in N_i} P(C_{i,j} \mid s)
(Bayes' theorem and the independence assumption)

P(s) = \frac{N_s}{N}, \qquad P(C_{i,j} \mid s) = \frac{N(C_{i,j}, s) + 1}{\sum_{x \in V} N(x, s) + |V|}
(maximum-likelihood prior and Laplace-smoothed word likelihood; C_{i,j} is the j-th word of cluster C_i, N_s the number of clusters labeled s, and V the vocabulary)
11
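A minimal sketch of multinomial Naive Bayes over cluster-level word counts using scikit-learn, whose default alpha=1.0 matches the Laplace smoothing above; the toy counts are made up.

```python
import numpy as np
from sklearn.naive_bayes import MultinomialNB

# Rows are word-count vectors of clustered texts (toy 5-word vocabulary),
# labels are the cluster polarities.
X_train = np.array([[3, 1, 0, 0, 1],   # positive cluster
                    [2, 2, 1, 0, 0],   # positive cluster
                    [0, 0, 2, 3, 1]])  # negative cluster
y_train = np.array([1, 1, 0])

clf = MultinomialNB(alpha=1.0)          # alpha=1.0 = add-one (Laplace) smoothing
clf.fit(X_train, y_train)
print(clf.predict_proba(np.array([[1, 0, 0, 2, 1]])))  # P(label | new cluster)
```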
 Given an unlabeled text x_j, we use the Euclidean distance to find the most similar positive
cluster C_m^+ and the most similar negative cluster C_n^-.
 The sentiment of x_j is estimated from how much the probability of each of the two clusters
changes when x_j is merged into it (cf. KNN).
 This merging operation is called the two-stage-merging method, as each unlabeled text is
merged twice.

f(x_j) = \begin{cases} 0, & \lvert P(NC_m^+) - P(C_m^+) \rvert \ge \lvert P(NC_n^-) - P(C_n^-) \rvert \\ 1, & \text{otherwise} \end{cases}

where NC_m^+ and NC_n^- denote the positive and negative clusters after merging x_j into them.
12
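A rough sketch of the two-stage-merging idea as I read it from this slide; the cluster probability function, the data layout and the decision direction are assumptions, not the authors' code.

```python
import numpy as np

def nearest(clusters, x):
    """Return the index of the cluster whose centroid is closest to x (Euclidean)."""
    dists = [np.linalg.norm(c.mean(axis=0) - x) for c in clusters]
    return int(np.argmin(dists))

def two_stage_merge_label(x, pos_clusters, neg_clusters, cluster_prob):
    """Sketch of the two-stage-merging rule.

    `cluster_prob(cluster)` is a placeholder for the cluster probability P(.)
    in the decision rule; its exact definition is not given on the slide.
    Each cluster is a 2-D array with one feature row per member text.
    """
    cp = pos_clusters[nearest(pos_clusters, x)]   # most similar positive cluster
    cn = neg_clusters[nearest(neg_clusters, x)]   # most similar negative cluster

    # Merge x into each candidate cluster and measure how much its probability changes.
    delta_pos = abs(cluster_prob(np.vstack([cp, x])) - cluster_prob(cp))
    delta_neg = abs(cluster_prob(np.vstack([cn, x])) - cluster_prob(cn))

    # Assumed reading: assign the polarity whose cluster is perturbed less by the merge.
    return 1 if delta_pos <= delta_neg else 0
```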
 Dataset: Stanford Twitter Sentiment Corpus (STS)
 Baseline: bag-of-unigrams and bigrams without clustering
 Evaluation Metrics: accuracy, precision, recall
 The average precision and accuracy are 1.7% and 1.3% higher, respectively, than the
baseline method.
Methods Accuracy Precision Recall
Our Method 0.816 0.82 0.813
Bigrams 0.805 0.807 0.802
13
 Continuous sentiment intensity provides a fine-grained representation of sentiment.
 Representing sentiment as valence-arousal (VA) values can easily be converted to discrete categories.
Example: “unbelievably disappointing” → Model → V: -0.5, A: 0.3
15
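As a hedged illustration of converting VA values to discrete categories (the thresholds and labels below are my assumptions; the slide only states that the conversion is easy):

```python
def va_to_category(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair to a coarse discrete label.

    Thresholds and label names are illustrative assumptions,
    assuming valence and arousal are centered at 0.
    """
    polarity = "positive" if valence > 0 else "negative"
    intensity = "high-arousal" if arousal > 0 else "low-arousal"
    return f"{polarity}, {intensity}"

print(va_to_category(-0.5, 0.3))  # "negative, high-arousal"
```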
 Lexicon-based methods: find the relationship between word-level and sentence-level
sentiment values. Word-level information comes from a sentiment lexicon, e.g. ANEW.
 Paltoglou 2013: weighted arithmetic mean, weighted geometric mean
 Malandrakis 2013: linear regression
Paltoglou, G., Theunis, M., Kappas, A., & Thelwall, M. (2013). Predicting emotional responses to long informal text. IEEE Transactions on Affective Computing, 4(1), 106-115.
Malandrakis, N., Potamianos, A., Iosif, E., & Narayanan, S. (2013). Distributional semantic models for affective text analysis. IEEE Transactions on Audio, Speech, and Language Processing, 21(11), 2379-2392.
16
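A minimal sketch of the weighted arithmetic and geometric means over lexicon valences; the toy lexicon values are made up (ANEW ratings use a 1-9 scale) and unit weights are used for simplicity.

```python
import numpy as np

# Toy ANEW-style lexicon (1-9 valence scale); the values here are made up.
lexicon = {"disappointing": 2.4, "great": 7.8, "pathetic": 2.0, "love": 8.0}

def sentence_valence(words):
    """Arithmetic and geometric means of word valences (unit weights in this sketch)."""
    vals = np.array([lexicon[w] for w in words if w in lexicon])
    arithmetic = vals.mean()
    geometric = np.exp(np.log(vals).mean())
    return arithmetic, geometric

print(sentence_valence("unbelievably disappointing".split()))  # only lexicon words contribute
```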
 To find the relationship between words and sentence-level sentiment.
                 CNN method      Lexicon-based methods
Word             Dense vector    VA value
Relationship     Auto-learned    Manually specified
Training data    Many            Few or none
Word order       Considered*     Not considered
Interpretation   Black box       Easy
17
 To find the relationship between words and sentence-level sentiment.
 Sentence Matrix -> Convolution Operator -> Max Pooling -> Regression
 Word Representation: dense vector, distributed representation
[Figure: tokens of a Chinese example sentence (我们的 / 心 / 不像 / 明镜 / 不可以 / 美丑 / 善恶 / 全部 / 包容) stacked as word vectors, and an embedding-space illustration in which related words cluster together (boat, ship, vessel; good, happy, glad; Beijing, Shanghai).]
Semantic information of a word is encoded in its dense vector. 18
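A minimal sketch (with made-up dimensions and random stand-in vectors) of building a sentence matrix by stacking word embeddings:

```python
import numpy as np

rng = np.random.default_rng(0)
embedding_dim = 8                        # illustrative; real models use e.g. 50-300 dims
vocab = ["我们的", "心", "不像", "明镜", "不可以", "美丑", "善恶", "全部", "包容"]
embeddings = {w: rng.normal(size=embedding_dim) for w in vocab}  # stand-in for trained vectors

# Sentence matrix S: one row per token, one column per embedding dimension.
sentence = ["我们的", "心", "不像", "明镜"]
S = np.vstack([embeddings[w] for w in sentence])
print(S.shape)  # (4, 8)
```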
 Sentence Matrix -> Convolution Operator -> Max Pooling -> Regression
[Figure: the Chinese example tokens stacked into the sentence matrix for the convolution step.]
c_i = f(w · S[i : i+m−1, :] + b)
 Dimensionality reduced
 Fewer model parameters
 Parameter sharing
f: activation function — ReLU, tanh, sigmoid, …
f(x) = max(0, x) (ReLU)
19
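A minimal NumPy sketch of the convolution step c_i = f(w · S[i : i+m−1, :] + b) with a ReLU activation; the filter width and all values are illustrative.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def conv_feature_map(S, w, b):
    """Slide a filter w (m x d) over the sentence matrix S (n x d), one value per window."""
    n, d = S.shape
    m = w.shape[0]
    return np.array([relu(np.sum(w * S[i:i + m, :]) + b) for i in range(n - m + 1)])

rng = np.random.default_rng(0)
S = rng.normal(size=(9, 8))   # 9 tokens, 8-dim embeddings (illustrative)
w = rng.normal(size=(3, 8))   # one filter covering m = 3 consecutive words
c = conv_feature_map(S, w, b=0.1)
print(c.shape)                # (7,) — one feature per window; weights are shared across positions
```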
 Sentence Matrix -> Convolution Operator -> Max Pooling -> Regression
 Aggregate the information and capture the most important features
[Figure: max pooling over the feature map of the Chinese example — e.g. with a 5×1 filter and stride 1, max(3, 6, 79, 7, 54) = 79; only the strongest response in each window is kept.]
20
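A minimal sketch of the max-pooling step on the example values from the figure above:

```python
import numpy as np

def max_pool_1d(feature_map, size=5, stride=1):
    """Max pooling over a 1-D feature map; keeps the strongest response per window."""
    return np.array([feature_map[i:i + size].max()
                     for i in range(0, len(feature_map) - size + 1, stride)])

c = np.array([3, 6, 79, 7, 54])
print(max_pool_1d(c, size=5, stride=1))  # [79] — the most salient feature survives
```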
 Sentence Matrix -> Convolution Operator -> Max Pooling -> Regression
[Figure: the pooled features x1, x2, …, xn feed into a linear output layer.]
Linear regression: h(x_i, w) = wᵀx_i = ŷ_i
Objective function: mean squared error (MSE)
21
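A minimal sketch of the linear output layer and the MSE objective (all numbers are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 10))              # pooled feature vectors for 4 sentences (illustrative)
w = rng.normal(size=10)                   # regression weights
y_true = np.array([6.2, 3.1, 7.5, 4.8])   # made-up valence ratings

y_pred = X @ w                            # h(x_i, w) = w^T x_i
mse = np.mean((y_true - y_pred) ** 2)     # the training objective
print(y_pred, mse)
```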
 Learning Algorithm: stochastic gradient descent (SGD)
 Learning the parameters of the model with labeled data
 Word vectors
 Convolution filter weights
 Linear regression weights
 Labeled data
 Chinese: CVAT dataset
 English: VADER dataset
Dataset   Size    #Word   L (avg. length)   Dims
CVAT      720     21094   192.1             V+A
Tweets    4000    15284   13.62             V
Movie     10605   29864   18.86             V
Amazon    3708    8555    17.3              V
NYT       5190    20941   17.48             V
22
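A hedged end-to-end sketch of the CNN regressor in Keras; the vocabulary size, sequence length, filter width, number of filters and learning rate are my assumptions, not values from the slides.

```python
import tensorflow as tf
from tensorflow.keras import layers

vocab_size, embed_dim, max_len = 20000, 100, 200   # illustrative values

model = tf.keras.Sequential([
    tf.keras.Input(shape=(max_len,)),
    layers.Embedding(vocab_size, embed_dim),        # word vectors (learned during training)
    layers.Conv1D(100, kernel_size=3, activation="relu"),  # convolution over word windows
    layers.GlobalMaxPooling1D(),                    # max pooling over each feature map
    layers.Dense(1, activation="linear"),           # linear regression to a VA value
])

# SGD + mean squared error, as on the slide; the learning rate is an assumption.
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01),
              loss="mse", metrics=["mae"])
model.summary()
# model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=10)
```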
 Each dataset is split into a training set, a validation set and a test set, used for model training,
hyper-parameter selection and model evaluation respectively.
 Evaluation Metrics
 MSE, Mean Square Error
 MAE, Mean Absolute Error
 Pearson’s correlation coefficient r
MSE = \frac{1}{n}\sum_{i=1}^{n} (y_i - \hat{y}_i)^2

MAE = \frac{1}{n}\sum_{i=1}^{n} \lvert y_i - \hat{y}_i \rvert

r = \frac{\frac{1}{n}\sum_{i=1}^{n}(y_i - \bar{y})(\hat{y}_i - \bar{\hat{y}})}{\sqrt{\frac{1}{n}\sum_{i=1}^{n}(y_i - \bar{y})^2}\;\sqrt{\frac{1}{n}\sum_{i=1}^{n}(\hat{y}_i - \bar{\hat{y}})^2}}
23
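A minimal sketch of the three evaluation metrics using NumPy and SciPy (the example ratings are made up):

```python
import numpy as np
from scipy.stats import pearsonr

def evaluate(y_true, y_pred):
    """MSE, MAE and Pearson's r between gold ratings and predictions."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    mse = np.mean((y_true - y_pred) ** 2)
    mae = np.mean(np.abs(y_true - y_pred))
    r, _ = pearsonr(y_true, y_pred)
    return mse, mae, r

print(evaluate([6.2, 3.1, 7.5, 4.8], [5.9, 3.6, 7.1, 5.0]))
```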
Valence ratings prediction (MSE / MAE / r)
Dataset   CNN                 wGW                 RMAR                LCEL                RMV
CVAT      1.17 / 0.88 / 0.73  2.30 / 1.23 / 0.62  1.89 / 1.14 / 0.63  1.81 / 0.95 / 0.66  1.49 / 0.98 / 0.72
Tweets    1.00 / 0.76 / 0.79  2.54 / 1.25 / 0.65  1.30 / 0.89 / 0.69  1.25 / 0.85 / 0.75  1.18 / 0.86 / 0.74
Movie     2.14 / 1.18 / 0.67  6.46 / 2.02 / 0.17  3.54 / 1.73 / 0.16  2.54 / 1.36 / 0.42  2.25 / 1.26 / 0.62
Amazon    1.50 / 0.95 / 0.67  3.75 / 1.51 / 0.35  2.66 / 1.38 / 0.27  1.45 / 1.14 / 0.45  2.20 / 1.19 / 0.56
NYT       0.84 / 0.72 / 0.36  3.47 / 1.54 / 0.28  0.79 / 0.71 / 0.26  0.83 / 0.75 / 0.37  0.61 / 0.63 / 0.60

Arousal ratings prediction (MSE / MAE / r)
CVAT      0.98 / 0.81 / 0.64  1.34 / 0.94 / 0.31  1.20 / 0.89 / 0.35  1.07 / 0.91 / 0.62  0.98 / 0.79 / 0.53
The CNN method improves VA prediction performance compared with the lexicon-based baselines and the
RMV method.
Baseline methods:
• wGW, Weighted geometric mean method
• RMAR, Regression on mean affective ratings
• LCEL, linear combination using expanded lexicon
• RMV, regression on mean vectors method
24
 Use transfer learning techniques to improve VA prediction performance.
 Motivation: there are numerous datasets for sentiment classification but only a few datasets
for VA prediction; sentiment polarity may be useful for VA prediction.
 Method: pre-train the classification CNN, then use the parameters of the pre-trained
network to initialize the VA-prediction CNN and continue training on the VA corpus.
25
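A hedged Keras sketch of the proposed transfer-learning step (this is listed as future work; the shared architecture and the weight-copying scheme below are assumptions):

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_cnn(output_units, output_activation):
    """Shared CNN body; only the output head differs between the two tasks."""
    return tf.keras.Sequential([
        tf.keras.Input(shape=(200,)),
        layers.Embedding(20000, 100),
        layers.Conv1D(100, 3, activation="relu"),
        layers.GlobalMaxPooling1D(),
        layers.Dense(output_units, activation=output_activation),
    ])

# 1) Pre-train a polarity classifier on the (larger) sentiment-classification data.
clf = build_cnn(1, "sigmoid")
clf.compile(optimizer="sgd", loss="binary_crossentropy")
# clf.fit(X_polarity, y_polarity, epochs=5)

# 2) Initialize the VA regressor with the pre-trained embedding/convolution weights,
#    then continue training on the (smaller) VA corpus.
reg = build_cnn(1, "linear")
for src, dst in zip(clf.layers[:-1], reg.layers[:-1]):   # copy all but the output head
    dst.set_weights(src.get_weights())
reg.compile(optimizer="sgd", loss="mse")
# reg.fit(X_va, y_va, epochs=10)
```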
何云超 yunchaohe@gmail.com
Thank you
26