A Discourse-based Compositional Approach to Overcome Drawbacks of Sequence-based Composition in Text Modeling via Neural Networks

  • Received : 2017.09.13
  • Accepted : 2017.11.01
  • Published : 2017.12.15

Abstract

Since the introduction of Deep Neural Networks to the field of Natural Language Processing, two major approaches to modeling text have been considered. One learns embeddings, i.e., distributed representations that capture the abstract semantics of words or sentences, from textual context. The other composes pretrained embeddings to obtain embeddings of longer texts. However, most studies of composition methods simply adopt word embeddings, without considering either the optimal embedding unit or the optimal composition method. In this paper, we conduct experiments to identify the optimal embedding unit and the optimal composition method for modeling longer texts such as documents. In addition, we propose a new discourse-based composition that overcomes the limitations of sequential composition when composing sentence embeddings.
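The contrast the abstract draws between order-insensitive and sequential composition of pretrained embeddings can be sketched as follows. This is a minimal illustration only: the random embeddings, the `compose_average` / `compose_sequential` functions, the tanh mixing rule, and the `alpha` parameter are stand-ins for illustration, not the models or data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pretrained unit embeddings (random stand-ins; real systems
# would use trained word- or sentence-level vectors).
word_emb = {w: rng.normal(size=4) for w in ["the", "plot", "was", "dull"]}

def compose_average(units):
    """Order-insensitive composition: the mean of the unit embeddings."""
    return np.mean(units, axis=0)

def compose_sequential(units, alpha=0.5):
    """Order-sensitive composition: a toy recurrent mixing rule standing in
    for RNN/LSTM-style sequential composition."""
    state = np.zeros_like(units[0])
    for u in units:
        state = np.tanh(alpha * state + (1 - alpha) * u)
    return state

sent = ["the", "plot", "was", "dull"]
vecs = [word_emb[w] for w in sent]
rev = list(reversed(vecs))

avg = compose_average(vecs)
seq = compose_sequential(vecs)
# Averaging yields the same vector for any word order,
# while the sequential composition depends on the order.
```

The same composition functions could be applied to sentence embeddings instead of word embeddings, which is exactly the choice of "embedding unit" the paper investigates.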

Acknowledgement

Grant : (Exobrain, Sub-project 3) Development of core technology for context-aware Deep-Symbolic hybrid intelligence and construction of language knowledge resources

Supported by : Institute for Information & Communications Technology Promotion (IITP)
