Deep Learning-Based Korean Sentence Generation Techniques

  • Published: 2022.09.30

References

  1. Barbieri, G., F. Pachet, P. Roy, M. Degli Esposti, "Markov constraints for generating lyrics with style," 20th European Conference on Artificial Intelligence, pp.115-120, 2012.
  2. Oliveira, H. G., R. Hervás, A. Díaz, P. Gervás, "Adapting a generic platform for poetry generation to produce Spanish poems," 5th International Conference on Computational Creativity, pp.63-71, 2014.
  3. Addanki, K., D. Wu, "Unsupervised rhyme scheme identification in hip hop lyrics using hidden Markov models," Statistical Language and Speech Processing, pp.39-50, 2013.
  4. Malmi, E., P. Takala, H. Toivonen, T. Raiko, A. Gionis, "DopeLearning: A computational approach to rap lyrics generation," arXiv preprint arXiv:1505.04771, pp.195-204, 2015.
  5. Hochreiter, S., J. Schmidhuber, "Long short-term memory," Neural Computation, Vol.9(8), pp.1735-1780, 1997. https://doi.org/10.1162/neco.1997.9.8.1735
  6. Chung, J., C. Gulcehre, K. Cho, Y. Bengio, "Empirical evaluation of gated recurrent neural networks on sequence modeling," arXiv preprint arXiv:1412.3555, pp.1-9, 2014.
  7. Zhang, X., M. Lapata, "Chinese poetry generation with recurrent neural networks," EMNLP 2014, pp.670-680, 2014.
  8. Potash, P., A. Romanov, A. Rumshisky, "GhostWriter: Using an LSTM for automatic rap lyric generation," EMNLP 2015, pp.1919-1924, 2015.
  9. Yan, R., "i, Poet: automatic poetry composition through recurrent neural networks with iterative polishing schema," 25th International Joint Conference on Artificial Intelligence, pp.2238-2244, 2016.
  10. Bowman, S. R., L. Vilnis, O. Vinyals, A. M. Dai, R. Jozefowicz, S. Bengio, "Generating sentences from a continuous space," arXiv preprint arXiv:1511.06349, 2015.
  11. Radford, A., J. Wu, R. Child, D. Luan, D. Amodei, I. Sutskever, "Language models are unsupervised multitask learners," OpenAI Blog, Vol.1(8), 2019.
  12. Brown, T. B., B. Mann, N. Ryder, M. Subbiah, J. Kaplan, P. Dhariwal, S. Agarwal, et al. "Language models are few-shot learners," arXiv preprint arXiv:2005.14165, 2020.
  13. Sutskever, I., O. Vinyals, Q. V. Le, "Sequence to sequence learning with neural networks," In Advances in neural information processing systems, pp.3104-3112, 2014.
  14. Fan, H., J. Wang, B. Zhuang, S. Wang, J. Xiao, "A hierarchical attention based seq2seq model for Chinese lyrics generation," In Pacific Rim International Conference on Artificial Intelligence, pp.279-288, 2019.
  15. Egonmwan, E., Y. Chali, "Transformer and seq2seq model for paraphrase generation," In Proceedings of the 3rd Workshop on Neural Generation and Translation, pp.249-255, 2019.
  16. Zhang, Y., Y. Wang, J. Liao, W. Xiao, "A hierarchical attention seq2seq model with CopyNet for text summarization," In 2018 International Conference on Robots & Intelligent System (ICRIS), IEEE, pp.316-320, 2018.
  17. Choi, H.-J., S.-H. Na, "Delete-MASS Gen: Korean style transfer based on word n-gram deletion and generation using MASS," Proceedings of the Korean Institute of Information Scientists and Engineers (KIISE) Conference, pp.1433-1435, 2019.
  18. Vaswani, A., N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, I. Polosukhin, et al. "Attention is all you need," In Advances in neural information processing systems, pp.5998-6008, 2017.
  19. Kingma, D. P., M. Welling, "Auto-encoding variational Bayes," arXiv preprint arXiv:1312.6114, pp.1-14, 2013.
  20. Miao, Y., L. Yu, P. Blunsom, "Neural variational inference for text processing," In International conference on machine learning, pp.1727-1736, 2016.
  21. Wang, W., Z. Gan, H. Xu, R. Zhang, G. Wang, D. Shen, L. Carin, et al. "Topic-Guided variational autoencoders for text generation," arXiv preprint arXiv:1903.07137, 2019.
  22. Weston, J., S. Chopra, A. Bordes, "Memory networks," arXiv preprint arXiv:1410.3916, pp.1-15, 2014.
  23. Sukhbaatar, S., J. Weston, R. Fergus, "End-to-end memory networks," In Advances in neural information processing systems, pp.2440-2448, 2015.
  24. Lin, Z., X. Huang, F. Ji, H. Chen, Y. Zhang, "Task-Oriented conversation generation using heterogeneous memory networks," arXiv preprint arXiv:1909.11287, 2019.
  25. Vinyals, O., M. Fortunato, N. Jaitly, "Pointer networks," In Advances in neural information processing systems, pp.2692-2700, 2015.
  26. See, A., P. J. Liu, C. D. Manning, "Get to the point: Summarization with pointer-generator networks," arXiv preprint arXiv:1704.04368, 2017.
  27. Devlin, J., M. W. Chang, K. Lee, K. Toutanova, "BERT: Pre-training of deep bidirectional transformers for language understanding," arXiv preprint arXiv:1810.04805, 2018.
  28. Lee, J.-S., Y.-T. Oh, H.-J. Byun, K.-G. Min, "Style change of Korean sentences using BERT," 31st Annual Conference on Human and Cognitive Language Technology (HCLT), pp.395-399, 2019.
  29. Raffel, C., N. Shazeer, A. Roberts, K. Lee, S. Narang, M. Matena, P. J. Liu, et al. "Exploring the limits of transfer learning with a unified text-to-text transformer," arXiv preprint arXiv:1910.10683, 2019.
  30. Habert, B., G. Adda, M. Adda-Decker, P. B. de Mareuil, S. Ferrari, O. Ferret, P. Paroubek, et al. "Towards tokenization evaluation," In Proceedings of Conference on Language Resources and Evaluation, Vol.98, pp.427-431, 1998.
  31. Kudo, T., J. Richardson, "SentencePiece: A simple and language independent subword tokenizer and detokenizer for neural text processing," arXiv preprint arXiv:1808.06226, 2018.
  32. Bahdanau, D., K. Cho, Y. Bengio, "Neural machine translation by jointly learning to align and translate," arXiv preprint arXiv:1409.0473, pp.1-15, 2014.
  33. Son, S.-H., "Text generation by category using the GPT-2 model," Master's thesis, Kookmin University, 2020.