This is a preliminary survey of important recent papers on sentiment classification. It covers only a few papers and summarizes fairly basic, general-purpose methods, mainly drawing on Bo Pang's related research. Below is my own outline, written in English.
• Produce a list of sentiment words by introspection and rely on them alone to classify the texts
• Algorithm used in Minghui's paper to predict the polarity of user interaction
• Combine three popular lexicons to get an English lexicon
• Check whether there are more positive sentiment words or more negative sentiment words in the expression
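The word-counting heuristic above can be sketched in a few lines of Python; the two lexicons here are toy stand-ins for the combined English lexicon:

```python
# Toy sentiment lexicons; a real system would load the combined lexicon.
POSITIVE = {"good", "great", "excellent", "wonderful", "enjoyable"}
NEGATIVE = {"bad", "boring", "awful", "terrible", "dull"}

def lexicon_polarity(text):
    """Classify by comparing counts of positive and negative words."""
    tokens = text.lower().split()
    pos = sum(1 for t in tokens if t in POSITIVE)
    neg = sum(1 for t in tokens if t in NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"  # tie, or no sentiment words found

print(lexicon_polarity("a great and enjoyable movie"))   # positive
print(lexicon_polarity("boring plot and awful acting"))  # negative
```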
• Naïve Bayes
• Maximum Entropy Model
• Support Vector Machines
• Need labeled data
• To assign to a given document d the class c* = argmax_c P(c | d)
• Bayes’ rule
• NB Classifier
• Simple, but with high predictive power
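The Naïve Bayes classifier above can be sketched from scratch; the tiny training set and add-one smoothing are illustrative choices:

```python
import math
from collections import Counter

# Tiny bag-of-words Naive Bayes sketch with add-one smoothing.
train = [("great fun film", "pos"), ("boring slow film", "neg"),
         ("great acting", "pos"), ("awful boring plot", "neg")]

class_docs = Counter(c for _, c in train)
word_counts = {c: Counter() for c in class_docs}
for text, c in train:
    word_counts[c].update(text.split())
vocab = {w for wc in word_counts.values() for w in wc}

def classify(text):
    """Return c* = argmax_c [ log P(c) + sum_i log P(f_i | c) ]."""
    best, best_lp = None, -math.inf
    for c in class_docs:
        lp = math.log(class_docs[c] / len(train))          # prior
        total = sum(word_counts[c].values())
        for w in text.split():                             # likelihoods
            lp += math.log((word_counts[c][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

print(classify("great film"))  # pos
```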
• The estimate of P(c|d) takes the following exponential form
• F_{i,c} is a feature/class function for feature f_i and class c, defined as follows
• Toolkit: Zhang Le's (2004) Maximum Entropy Modeling Toolkit for Python and C++
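For reference, the exponential form and feature/class function referred to above (following Pang et al., 2002) are:

```latex
P_{\mathrm{ME}}(c \mid d) := \frac{1}{Z(d)} \exp\Big( \sum_{i} \lambda_{i,c} \, F_{i,c}(d, c) \Big),
\qquad
F_{i,c}(d, c') :=
\begin{cases}
1, & n_i(d) > 0 \text{ and } c' = c \\
0, & \text{otherwise}
\end{cases}
```

where Z(d) is a normalization constant, n_i(d) is the count of feature f_i in document d, and the λ_{i,c} are feature weights set to maximize the entropy of the induced distribution.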
• Given a set of training data, the SVM classifier finds the hyperplane such that each training point is correctly classified and the hyperplane is as far as possible from the points closest to it
• Toolkits: SVMlight, LibSVM, PyML
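The max-margin idea above can be sketched with scikit-learn's linear SVM (a stand-in for the toolkits named above; the toy corpus is illustrative):

```python
# Linear SVM over bag-of-words features, using scikit-learn as a stand-in
# for SVMlight / LibSVM / PyML.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC

docs = ["great fun film", "great acting", "boring slow film", "awful boring plot"]
labels = ["pos", "pos", "neg", "neg"]

vec = CountVectorizer()
X = vec.fit_transform(docs)        # bag-of-words feature vectors
clf = LinearSVC().fit(X, labels)   # finds the max-margin hyperplane

print(clf.predict(vec.transform(["great film"])))
```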
• Unigrams vs. Bigrams vs. Both
• Feature frequency vs. Presence
• POS tags
• Position of words
• Negation words
• Adjectives and verbs
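One of the feature choices above, negation handling, is commonly implemented by tagging every word between a negation word and the next punctuation mark (as in Pang et al., 2002); a rough sketch, with an illustrative negation list:

```python
import re

# Illustrative negation cues; a real list would be longer.
NEGATION = {"not", "isn't", "didn't", "no", "never"}

def add_negation_tags(text):
    """Prefix NOT_ to words following a negation word, up to the next punctuation."""
    out, negating = [], False
    for token in re.findall(r"[\w']+|[.,!?;]", text.lower()):
        if token in ".,!?;":
            negating = False      # negation scope ends at punctuation
            out.append(token)
        elif negating:
            out.append("NOT_" + token)
        else:
            out.append(token)
            if token in NEGATION:
                negating = True
    return out

print(add_negation_tags("I didn't like this movie, but the plot was good."))
# ['i', "didn't", 'NOT_like', 'NOT_this', 'NOT_movie', ',',
#  'but', 'the', 'plot', 'was', 'good', '.']
```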
• Combining Bayes, MaxEnt, and SVM classifiers over the same data provided a three-to-four-percent boost over the best of the individual classifiers alone.
Integrating a Sentence Classifier with a Language Model
• Run on each sentence of the review to obtain a decision of "positive" or "negative".
• The sentences are then used to "vote" the review as negative or positive on the basis of their probability scores.
• After that, the decisions from the sentence classifier and the review classifier are combined using optimum weights.
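The combination step above can be sketched as a weighted blend of the two classifiers' positive-class probabilities; the weight value here is a hypothetical placeholder, not the paper's tuned optimum:

```python
def combine(review_prob, sentence_probs, w=0.6):
    """Blend the review-level score with the averaged sentence-level scores.

    w is a hypothetical weight; in practice it would be tuned on held-out data.
    """
    sentence_score = sum(sentence_probs) / len(sentence_probs)
    blended = w * review_prob + (1 - w) * sentence_score
    return "positive" if blended >= 0.5 else "negative"

# Review classifier is unsure (0.4), but the sentence votes pull it positive.
print(combine(0.4, [0.9, 0.8, 0.7]))  # positive
```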
Performance Improvement with Information from the Sentence Level
The Sentence Classifier
• Also based on Naïve Bayes Algorithm
• Run on the 5331 negative and 5331 positive sentences
Vote to Classify Reviews
• The majority vote is used to classify the review.
• Using the actual scores of positivity and negativity computed for the sentences yields a considerable 1.9% improvement in performance.
Weighting Sentences by Position
• Weight the sentences by their position in the review.
• Provide more weight to sentences that are toward the beginning and end of a review.
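One way to realize the position weighting described above is a U-shaped weight that is highest at either end of the review; the particular linear shape here is an assumption, not the paper's scheme:

```python
def position_weight(i, n):
    """Weight for sentence i of n: 1.0 at either end, 0.5 in the middle.

    The U-shaped linear scheme is an illustrative assumption.
    """
    if n == 1:
        return 1.0
    d = min(i, n - 1 - i) / (n - 1)  # distance from nearest end, in [0, 0.5]
    return 1.0 - d

print([position_weight(i, 5) for i in range(5)])  # [1.0, 0.75, 0.5, 0.75, 1.0]
```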
Incorporate the "Focus" of a Sentence
• "Focus" here means whether a given sentence is talking about a movie or not.
• Two approaches:
– Check if a sentence contains phrases such as "this movie", "the plot", "the actors", etc.
– Check if a sentence contains a movie name.
• Using the classification of sentences in addition to the Naïve Bayes model increased the accuracy of the system by 5.5%.
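The two focus checks above can be sketched as simple substring tests; the cue-phrase list and movie-name list here are illustrative stand-ins:

```python
# Illustrative cue phrases and movie names; real lists would be much larger.
MOVIE_CUES = ("this movie", "the plot", "the actors", "the film")
MOVIE_NAMES = ("the godfather", "casablanca")

def is_movie_focused(sentence):
    """True if the sentence mentions a movie cue phrase or a known movie name."""
    s = sentence.lower()
    return any(cue in s for cue in MOVIE_CUES) or any(n in s for n in MOVIE_NAMES)

print(is_movie_focused("The plot was thin."))         # True
print(is_movie_focused("We got popcorn afterward."))  # False
```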
• Use other algorithms, such as MaxEnt or SVM, for the sentence classifier.
• Divide the review into paragraphs and give different weights to different paragraphs according to their positions.
• 1. Bo Pang, Lillian Lee, and Shivakumar Vaithyanathan. 2002. Thumbs up? Sentiment Classification Using Machine Learning Techniques. (Proceedings of EMNLP)
• 2. Soo-Min Kim et al. 2004. Determining the Sentiment of Opinions. (Proceedings of COLING)
• 3. Sunil Khanal. 2010. Sentiment Classification Using Language Models and Sentence Position Information. (Stanford CS 224N final project)
• 4. Michel Galley et al. 2004. Identifying Agreement and Disagreement in Conversational Speech: Use of Bayesian Networks to Model Pragmatic Dependencies. (Proceedings of ACL)