http://dx.doi.org/10.7236/IJIBC.2021.13.1.100

Developing Sentimental Analysis System Based on Various Optimizer  

Eom, Seong Hoon (Department of Electrical and Electronic Engineering, Youngsan University)
Publication Information
International Journal of Internet, Broadcasting and Communication, v.13, no.1, 2021, pp. 100-106
Abstract
Over the past few decades, natural language processing research made relatively little progress. However, the widespread adoption of deep learning and neural networks has drawn renewed attention to their application in natural language processing. Sentiment analysis is one of the challenging tasks in this field. Because emotions are what a person thinks and feels, a sentiment analysis system should be able to infer a person's attitudes, opinions, and inclinations from text. In its simplest form, sentiment analysis classifies a text into two classes: positive and negative. In this paper, we propose a deep learning based sentiment analysis system and evaluate it under various optimizers, namely SGD, Adam, and RMSProp. Experimental results show that the RMSProp optimizer achieves the best performance on the IMDB dataset. Future work is to search for better hyperparameters for the sentiment analysis system.
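To make the comparison concrete, below is a minimal sketch of the kind of experiment the abstract describes: an embedding + LSTM binary classifier trained on the IMDB review dataset once per optimizer. The framework (TensorFlow/Keras) and all hyperparameters (vocabulary size, sequence length, layer sizes, learning rates, epochs) are illustrative assumptions; the abstract does not specify the paper's exact setup.

# Hypothetical sketch (TensorFlow/Keras assumed): an Embedding -> LSTM ->
# sigmoid classifier for positive/negative IMDB reviews, trained once per
# optimizer so SGD, RMSprop, and Adam can be compared under identical conditions.
import tensorflow as tf
from tensorflow.keras import layers, optimizers

VOCAB_SIZE = 10_000  # keep the 10,000 most frequent words (assumed)
MAX_LEN = 200        # pad/truncate each review to 200 tokens (assumed)

# IMDB reviews come pre-tokenized as integer word indices.
(x_train, y_train), (x_test, y_test) = \
    tf.keras.datasets.imdb.load_data(num_words=VOCAB_SIZE)
x_train = tf.keras.preprocessing.sequence.pad_sequences(x_train, maxlen=MAX_LEN)
x_test = tf.keras.preprocessing.sequence.pad_sequences(x_test, maxlen=MAX_LEN)

def build_model(optimizer):
    # Word embedding -> LSTM -> single sigmoid unit (positive vs. negative).
    model = tf.keras.Sequential([
        layers.Embedding(VOCAB_SIZE, 128),
        layers.LSTM(64),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer=optimizer,
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# One identical model per optimizer; only the weight-update rule differs.
for opt in (optimizers.SGD(learning_rate=0.01),
            optimizers.RMSprop(learning_rate=0.001),
            optimizers.Adam(learning_rate=0.001)):
    model = build_model(opt)
    model.fit(x_train, y_train, epochs=3, batch_size=128,
              validation_data=(x_test, y_test), verbose=2)

Under a setup like this, accuracy on the held-out test split is the natural basis for the optimizer comparison reported above.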
Keywords
Sentiment analysis; Natural language processing; Optimizer; Word embedding