1. P. Li et al., Deep recurrent generative decoder for abstractive text summarization, in Proc. Conf. Empirical Methods Natural Language Process. (Copenhagen, Denmark), 2017, pp. 2091-2100.
2. P.-E. Genest and G. Lapalme, Framework for abstractive summarization using text-to-text generation, in Proc. Workshop Monolingual Text-To-Text Generation (Portland, OR, USA), 2011, pp. 64-73.
3. J. Cheng and M. Lapata, Neural summarization by extracting sentences and words, in Proc. Annu. Meeting Association Comput. Linguistics (Berlin, Germany), 2016, pp. 484-494.
4. R. Nallapati et al., Abstractive text summarization using sequence-to-sequence RNNs and beyond, in Proc. SIGNLL Conf. Comput. Natural Language Learn. (Berlin, Germany), 2016, pp. 280-290.
5. Y. Zhang, Q. Liu, and L. Song, Sentence-state LSTM for text representation, in Proc. Annu. Meeting Association Comput. Linguistics (Melbourne, Australia), 2018, pp. 317-327.
6. S. Song, H. Huang, and T. Ruan, Abstractive text summarization using LSTM-CNN based deep learning, Multimedia Tools Appl. 78 (2018), 1-19.
7. F. A. Gers, N. N. Schraudolph, and J. Schmidhuber, Learning precise timing with LSTM recurrent networks, J. Machine Learn. Res. 3 (2003), no. 1, 115-143.
8. A. Sinha, A. Yadav, and A. Gahlot, Extractive text summarization using neural networks, arXiv Preprint, CoRR, 2018, abs/1802.10137.
9. S. Narayan, S. B. Cohen, and M. Lapata, Ranking sentences for extractive summarization with reinforcement learning, in Proc. Conf. North American Chapter Association Comput. Linguistics: Human Language Technol. (New Orleans, LA, USA), 2018, pp. 1747-1759.
10. T. A. Bohn and C. X. Ling, Neural sentence location prediction for summarization, arXiv Preprint, CoRR, 2018, abs/1804.08053.
11. L. Wang et al., Can syntax help? Improving an LSTM-based sentence compression model for new domains, in Proc. Annu. Meeting Association Comput. Linguistics (Vancouver, Canada), 2017, pp. 1385-1393.
12. Q. Zhou et al., Neural document summarization by jointly learning to score and select sentences, in Proc. Annu. Meeting Association Comput. Linguistics (Melbourne, Australia), 2018, pp. 654-663.
13. S. Tarnpradab, F. Liu, and K. A. Hua, Toward extractive summarization of online forum discussions via hierarchical attention networks, arXiv Preprint, CoRR, 2018, abs/1805.10390.
14. R. Nallapati, F. Zhai, and B. Zhou, SummaRuNNer: A recurrent neural network based sequence model for extractive summarization of documents, arXiv Preprint, CoRR, 2016, abs/1611.04230.
15. K. Filippova et al., Sentence compression by deletion with LSTMs, in Proc. Conf. Empirical Methods Natural Language Process. (Lisbon, Portugal), 2015, pp. 360-368.
16. A. M. Rush, S. Chopra, and J. Weston, A neural attention model for abstractive sentence summarization, arXiv Preprint, CoRR, 2015, abs/1509.00685.
17. X. Shi et al., Convolutional LSTM network: A machine learning approach for precipitation nowcasting, in Proc. Int. Conf. Neural Inf. Process. Syst. (Montreal, Canada), 2015, pp. 802-810.
18. S. Chopra, M. Auli, and A. M. Rush, Abstractive sentence summarization with attentive recurrent neural networks, in Proc. Conf. North American Chapter Association Comput. Linguistics: Human Language Technol. (San Diego, CA, USA), 2016, pp. 93-98.
19. K. Al-Sabahi, Z. Zuping, and Y. Kang, Bidirectional attentional encoder-decoder model and bidirectional beam search for abstractive summarization, arXiv Preprint, CoRR, 2018, abs/1809.06662.
20. K. Lopyrev, Generating news headlines with recurrent neural networks, arXiv Preprint, CoRR, 2015, abs/1512.01712.
21. J.-P. Ng and V. Abrecht, Better summarization evaluation with word embeddings for ROUGE, in Proc. Conf. Empirical Methods Natural Language Process. (Lisbon, Portugal), 2015, pp. 1925-1930.
22. F. Karim et al., LSTM fully convolutional networks for time series classification, IEEE Access 6 (2018), 1662-1669.
23. C. A. Colmenares et al., HEADS: Headline generation as sequence prediction using an abstract feature-rich space, in Proc. Conf. North American Chapter Association Comput. Linguistics: Human Language Technol. (Denver, CO, USA), 2015, pp. 133-142.
24. Z. C. Lipton, A critical review of recurrent neural networks for sequence learning, arXiv Preprint, CoRR, 2015, abs/1506.00019.
25. S. Martschat and K. Markert, Improving ROUGE for timeline summarization, in Proc. Conf. Eur. Chapter Association Comput. Linguistics (Valencia, Spain), 2017, pp. 285-290.
26. J. Tan, X. Wan, and J. Xiao, Abstractive document summarization with a graph-based attentional neural model, in Proc. Annu. Meeting Association Comput. Linguistics (Vancouver, Canada), 2017, pp. 1171-1181.
27. P. Mehta, From extractive to abstractive summarization: A journey, in Proc. ACL 2016 Student Research Workshop (Berlin, Germany), 2016, pp. 100-106.
28. J. Chen and H. Zhuge, Abstractive text-image summarization using multi-modal attentional hierarchical RNN, in Proc. Conf. Empirical Methods Natural Language Process. (Brussels, Belgium), 2018, pp. 4046-4056.
29. D. Yogatama, F. Liu, and N. A. Smith, Extractive summarization by maximizing semantic volume, in Proc. Conf. Empirical Methods Natural Language Process. (Lisbon, Portugal), 2015, pp. 1961-1966.