Funding Information
This work was supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2017S1A6A3A01078538).
References
- N. Kotonya and F. Toni, "Explainable Automated Fact-Checking: A Survey," in Proceedings of the 28th International Conference on Computational Linguistics, Barcelona, Spain (online), pp. 5430-5443, 2020.
- L. Wu, Y. Rao, Y. Zhao, H. Liang, and A. Nazir, "DTCA: Decision Tree-based Co-Attention Networks for Explainable Claim Verification," in Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, online, pp. 1024-1035, 2020.
- A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, L. Kaiser, and I. Polosukhin, "Attention Is All You Need," in Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, pp. 6000-6010, 2017.
- J. Devlin, M. W. Chang, K. Lee, and K. Toutanova, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding," in Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics, Minneapolis, MN, pp. 4171-4186, 2019.
- Wikipedia, Locality-Sensitive Hashing [Internet]. Available: https://en.wikipedia.org/wiki/Locality-sensitive_hashing.
- N. Kitaev, L. Kaiser, and A. Levskaya, "Reformer: The Efficient Transformer," in International Conference on Learning Representations, online, 2020.
- E. Kochkina, M. Liakata, and A. Zubiaga, "All-in-one: Multi-task Learning for Rumour Verification," in Proceedings of the 27th International Conference on Computational Linguistics, Santa Fe, NM, pp. 3402-3413, 2018.