http://dx.doi.org/10.6109/jkiice.2022.26.5.662

Semantic Role Labeling using Biaffine Average Attention Model  

Nam, Chung-Hyeon (Department of Computer Engineering, Korea University of Technology and Education)
Jang, Kyung-Sik (Department of Computer Engineering, Korea University of Technology and Education)
Abstract
Semantic role labeling (SRL) is the task of extracting a predicate and its arguments, such as agent, patient, place, and time. Previous SRL studies proposed pipeline methods that extract linguistic features of a sentence, but in such methods the errors made at each extraction step propagate and degrade labeling performance. End-to-end neural network models have therefore recently been proposed. In this paper, we propose a neural network model using Biaffine Average Attention for the SRL task. Instead of the LSTM models of previous studies, which predict the label of a token mainly from its surrounding context, the proposed model can attend to the entire sentence regardless of the distance between the predicate and its arguments. For evaluation, we compared our model against two BERT-based models from existing studies using F1 score, and our model achieved an F1 score of 76.21%, higher than the comparison models.
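The core of the biaffine scoring described above can be illustrated with a minimal sketch. This is a generic Dozat-and-Manning-style biaffine scorer, not the paper's exact Biaffine Average Attention model; the dimensions, weight shapes, and function name are assumptions for illustration only.

```python
import numpy as np

def biaffine_scores(H_pred, H_arg, U, W, b):
    """Score every (predicate, argument) token pair in a sentence.

    Hedged sketch: a generic biaffine scorer, not the paper's exact model.
    H_pred: (n, d) predicate-side token representations
    H_arg:  (n, d) argument-side token representations
    U: (d, d) bilinear weight; W: (2d,) linear weight; b: scalar bias
    Returns an (n, n) matrix of pair scores, covering the whole sentence
    regardless of predicate-argument distance.
    """
    d = H_pred.shape[1]
    bilinear = H_pred @ U @ H_arg.T                 # (n, n) pairwise bilinear term
    linear = (H_pred @ W[:d])[:, None] + H_arg @ W[d:]  # broadcast linear terms
    return bilinear + linear + b

rng = np.random.default_rng(0)
n, d = 5, 8
scores = biaffine_scores(rng.normal(size=(n, d)), rng.normal(size=(n, d)),
                         rng.normal(size=(d, d)), rng.normal(size=2 * d), 0.0)
print(scores.shape)  # one score per (predicate, argument) token pair
```

Because the score matrix covers all token pairs at once, argmax over each row directly selects the most likely argument span head for each predicate candidate.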
Keywords
Deep Learning; Natural Language Processing; Semantic Role Labeling;
Citations & Related Records
  • Reference
1 J. S. Bae and C. K. Lee, "Korean Semantic Role Labeling using Stacked Bidirectional LSTM-CRFs," Journal of KIISE, vol. 44, no. 1, pp. 36-43, Jan. 2017.
2 K. H. Park and S. H. Na, "A Neural Attention Model for Korean Semantic Role Labeling," in Proceedings of the 2017 Korea Software Congress, Busan, South Korea, pp. 512-514, 2017.
3 T. Dozat and C. D. Manning, "Deep Biaffine Attention for Neural Dependency Parsing," in Proceedings of the 5th International Conference on Learning Representations, Toulon, France, 2017.
4 S. H. Na, J. R. Li, J. H. Shin, and K. I. Kim, "Deep Biaffine Attention for Korean Dependency Parsing," in Proceedings of the 2017 Korea Computer Congress, Jeju, South Korea, pp. 584-586, 2017.
5 J. Yu, B. Bohnet, and M. Poesio, "Named Entity Recognition as Dependency Parsing," in Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Online, pp. 6470-6476, 2020.
6 AI Hub Common sense AI dataset [Internet]. Available: https://www.aihub.or.kr/.
7 J. S. Bae, C. K. Lee, S. J. Lim, and H. K. Kim, "Korean Semantic Role Labeling with BERT," in Proceedings of the 2019 Korea Computer Congress, Jeju, South Korea, pp. 512-514, 2019.
8 L. He, K. Lee, M. Lewis, and L. Zettlemoyer, "Deep Semantic Role Labeling: What Works and What's Next," in Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, Vancouver, Canada, pp. 473-483, 2017.
9 L. He, K. Lee, O. Levy, and L. Zettlemoyer, "Jointly Predicting Predicates and Arguments in Neural Semantic Role Labeling," in Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, Melbourne, Australia, pp. 364-369, 2018.
10 J. Devlin, M. W. Chang, K. Lee, and K. Toutanova, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding," in Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Minneapolis, USA, pp. 4171-4186, 2019.
11 T. Y. Lin, P. Goyal, K. He, and P. Dollar, "Focal Loss for Dense Object Detection," in Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), Venice, Italy, pp. 2980-2988, 2017.