References
- J. Chen and H. Zhuge, Abstractive text-image summarization using multi-modal attentional hierarchical RNN, in Proc. Conf. Empirical Methods Natural Language Process. (Brussels, Belgium), 2018, pp. 4046-4056.
- D. Yogatama, F. Liu, and N. A. Smith, Extractive summarization by maximizing semantic volume, in Proc. Conf. Empirical Methods Natural Language Process. (Lisbon, Portugal), 2015, pp. 1961-1966.
- P. Mehta, From extractive to abstractive summarization: A journey, in Proc. ACL 2016 Student Research Workshop (Berlin, Germany), 2016, pp. 100-106.
- P. Li et al., Deep recurrent generative decoder for abstractive text summarization, in Proc. Conf. Empirical Methods Natural Language Process. (Copenhagen, Denmark), 2017, pp. 2091-2100.
- P.-E. Genest and G. Lapalme, Framework for abstractive summarization using text-to-text generation, in Proc. Workshop Monolingual Text-To-Text Generation (Portland, OR, USA), 2011, pp. 64-73.
- J. Cheng and M. Lapata, Neural summarization by extracting sentences and words, in Proc. Annu. Meeting Association Comput. Linguistics (Berlin, Germany), 2016, pp. 484-494.
- R. Nallapati et al., Abstractive text summarization using sequence-to-sequence RNNs and beyond, in Proc. SIGNLL Conf. Comput. Natural Language Learn. (Berlin, Germany), 2016, pp. 280-290.
- Y. Zhang, Q. Liu, and L. Song, Sentence-state LSTM for text representation, in Proc. Annu. Meeting Association Comput. Linguistics (Melbourne, Australia), 2018, pp. 317-327.
- S. Song, H. Huang, and T. Ruan, Abstractive text summarization using LSTM-CNN based deep learning, Multimedia Tools Appl. 78 (2018), 1-19. https://doi.org/10.1007/s11042-018-6670-5
- F. A. Gers, N. N. Schraudolph, and J. Schmidhuber, Learning precise timing with LSTM recurrent networks, J. Machine Learn. Res. 3 (2003), no. 1, 115-143.
- A. Sinha, A. Yadav, and A. Gahlot, Extractive text summarization using neural networks, arXiv Preprint, CoRR, 2018, abs/1802.10137.
- S. Narayan, S. B. Cohen, and M. Lapata, Ranking sentences for extractive summarization with reinforcement learning, in Proc. Conf. North American Chapter Association Comput. Linguistics: Human Language Technol. (New Orleans, LA, USA), 2018, pp. 1747-1759.
- T. A. Bohn and C. X. Ling, Neural sentence location prediction for summarization, arXiv Preprint, CoRR, 2018, abs/1804.08053.
- Q. Zhou et al., Neural document summarization by jointly learning to score and select sentences, in Proc. Annu. Meeting Association Comput. Linguistics (Melbourne, Australia), 2018, pp. 654-663.
- S. Tarnpradab, F. Liu, and K. A. Hua, Toward extractive summarization of online forum discussions via hierarchical attention networks, arXiv Preprint, CoRR, 2018, abs/1805.10390.
- R. Nallapati, F. Zhai, and B. Zhou, SummaRuNNer: A recurrent neural network based sequence model for extractive summarization of documents, arXiv Preprint, CoRR, 2016, abs/1611.04230.
- L. Wang et al., Can syntax help? Improving an LSTM-based sentence compression model for new domains, in Proc. Annu. Meeting Association Comput. Linguistics (Vancouver, Canada), 2017, pp. 1385-1393.
- K. Filippova et al., Sentence compression by deletion with LSTMs, in Proc. Conf. Empirical Methods Natural Language Process. (Lisbon, Portugal), 2015, pp. 360-368.
- A. M. Rush, S. Chopra, and J. Weston, A neural attention model for abstractive sentence summarization, arXiv Preprint, CoRR, 2015, abs/1509.00685.
- S. Chopra, M. Auli, and A. M. Rush, Abstractive sentence summarization with attentive recurrent neural networks, in Proc. Conf. North American Chapter Association Comput. Linguistics: Human Language Technol. (San Diego, CA, USA), 2016, pp. 93-98.
- K. Al-Sabahi, Z. Zuping, and Y. Kang, Bidirectional attentional encoder-decoder model and bidirectional beam search for abstractive summarization, arXiv Preprint, CoRR, 2018, abs/1809.06662.
- K. Lopyrev, Generating news headlines with recurrent neural networks, arXiv Preprint, CoRR, 2015, abs/1512.01712.
- X. Shi et al., Convolutional LSTM network: A machine learning approach for precipitation nowcasting, in Proc. Int. Conf. Neural Inf. Process. Syst. (Montreal, Canada), 2015, pp. 802-810.
- F. Karim et al., LSTM fully convolutional networks for time series classification, IEEE Access 6 (2018), 1662-1669. https://doi.org/10.1109/access.2017.2779939
- C. A. Colmenares et al., HEADS: Headline generation as sequence prediction using an abstract feature-rich space, in Proc. Conf. North American Chapter Association Comput. Linguistics: Human Language Technol. (Denver, CO, USA), 2015, pp. 133-142.
- Z. C. Lipton, A critical review of recurrent neural networks for sequence learning, arXiv Preprint, CoRR, 2015, abs/1506.00019.
- J.-P. Ng and V. Abrecht, Better summarization evaluation with word embeddings for ROUGE, in Proc. Conf. Empirical Methods Natural Language Process. (Lisbon, Portugal), 2015, pp. 1925-1930.
- S. Martschat and K. Markert, Improving ROUGE for timeline summarization, in Proc. Conf. Eur. Chapter Association Comput. Linguistics (Valencia, Spain), 2017, pp. 285-290.
- J. Tan, X. Wan, and J. Xiao, Abstractive document summarization with a graph-based attentional neural model, in Proc. Annu. Meeting Association Comput. Linguistics (Vancouver, Canada), 2017, pp. 1171-1181.