%0 Journal Article
%A SU Fang
%A WANG Xiao-yu
%A ZHANG Zhi
%T Review Summarization Generation Based on Attention Mechanism
%D 2018
%R 10.13190/j.jbupt.2017-219
%J Journal of Beijing University of Posts and Telecommunications
%P 7-13
%V 41
%N 3
%X To implement abstractive review summarization, the sequence-to-sequence neural network model was studied, and an improved attention mechanism for review summarization built on this model was proposed. Based on the characteristics of review summarization samples, the local attention mechanism is modified so that more attention weight is placed on the beginning of the source sentence. Each word of the summary is then generated by the end-to-end model. Experiments show that this approach achieves superior performance on English review summarization within the same category when the review length is less than 200.
%U https://journal.bupt.edu.cn/EN/10.13190/j.jbupt.2017-219
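The abstract describes weighting local attention so that the start of the source sentence receives more attention. The sketch below is only an illustration of that idea, not the authors' implementation; the exponential position bias, its decay parameter, and the function name are assumptions introduced here.

```python
import numpy as np

def biased_local_attention(scores, decay=0.05):
    """Illustrative attention weighting that favors early source positions.

    scores : 1-D array of raw alignment scores for one decoding step,
             one entry per source token.
    decay  : assumed rate at which the extra weight falls off with
             position (hypothetical value, not taken from the paper).
    """
    positions = np.arange(len(scores))
    # Exponentially decaying bias: the earliest source tokens get the largest boost.
    position_bias = np.exp(-decay * positions)
    # Add the bias in log-space, then normalize with a softmax.
    combined = scores + np.log(position_bias + 1e-9)
    weights = np.exp(combined - combined.max())
    return weights / weights.sum()

# Example: raw scores for a 10-token source review.
raw_scores = np.random.randn(10)
print(biased_local_attention(raw_scores))
```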