1. ATAE-LSTM, which combines an LSTM with an attention mechanism; the model incorporates the aspect embedding into the computation of the attention weights.
(Wang, Y.; Huang, M.; Zhao, L.; Zhu, X. Attention-based lstm for aspect-level sentiment classification.
In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, Austin, TX,
USA, 1–5 November 2016; pp. 606–615.)
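The aspect-aware attention above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the function name `aspect_attention` and the random parameter matrices are hypothetical, and the key idea shown is that the aspect embedding is concatenated to every LSTM hidden state before the attention scores are computed.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def aspect_attention(H, aspect, W, w):
    """H: (T, d) LSTM hidden states; aspect: (da,) aspect embedding.
    The aspect embedding is appended to every hidden state before
    scoring, so the attention weights depend on the aspect."""
    T = H.shape[0]
    M = np.tanh(np.concatenate([H, np.tile(aspect, (T, 1))], axis=1) @ W)
    alpha = softmax(M @ w)      # (T,) attention weights over time steps
    return alpha @ H, alpha     # aspect-aware sentence representation

T, d, da = 5, 8, 4
H = rng.standard_normal((T, d))
aspect = rng.standard_normal(da)
W = rng.standard_normal((d + da, d + da))
w = rng.standard_normal(d + da)
r, alpha = aspect_attention(H, aspect, W, w)
```

In the paper the aspect embedding is also concatenated to the word embeddings at the LSTM input; the sketch only shows the attention side.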
2. RAM: applies a multi-hop (recurrent) attention mechanism over a memory built from a bidirectional LSTM.
(Chen, P.; Sun, Z.; Bing, L.; Yang, W. Recurrent attention network on memory for aspect sentiment analysis.
In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, Copenhagen,
Denmark, 7–11 September 2017; pp. 452–461.)
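A rough sketch of the multi-hop idea, under stated simplifications: RAM combines the results of several attention passes with a GRU, which is replaced here by a fixed-gate blend for brevity; all parameter names are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def multi_hop_attention(M, Wm, We, v, g=0.5, hops=3):
    """M: (T, d) memory slices (e.g. BiLSTM hidden states).
    Each hop attends over the memory conditioned on the current
    episode vector e; RAM's GRU update is approximated here by a
    fixed-gate blend (a simplification, not the paper's update)."""
    e = np.zeros(M.shape[1])
    for _ in range(hops):
        scores = np.tanh(M @ Wm + e @ We) @ v   # (T,) attention scores
        alpha = softmax(scores)
        e = g * e + (1 - g) * (alpha @ M)       # fold attended memory into e
    return e, alpha

T, d = 6, 8
M = rng.standard_normal((T, d))
Wm = rng.standard_normal((d, d))
We = rng.standard_normal((d, d))
v = rng.standard_normal(d)
e, alpha = multi_hop_attention(M, Wm, We, v)
```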
3. IAN, a model with a two-way (interactive) attention mechanism that learns attention weights for the context and the aspect term interactively.
(Ma, D.; Li, S.; Zhang, X.; Wang, H. Interactive attention networks for aspect-level sentiment classification.
arXiv 2017, arXiv:1709.00893.)
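The interaction can be sketched as two attention passes that condition on each other's pooled representation. This is a hypothetical minimal version: IAN uses separate LSTMs and bias terms, which are omitted here, and the names `interactive_attention`, `Wc`, `Wa` are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def interactive_attention(C, A, Wc, Wa):
    """C: (Tc, d) context hidden states; A: (Ta, d) aspect hidden states.
    Each side's attention scores are conditioned on the other side's
    mean-pooled vector, so context and aspect attend to each other."""
    c_avg, a_avg = C.mean(axis=0), A.mean(axis=0)
    alpha_c = softmax(np.tanh(C @ Wc @ a_avg))  # context weights, given aspect
    alpha_a = softmax(np.tanh(A @ Wa @ c_avg))  # aspect weights, given context
    return np.concatenate([alpha_c @ C, alpha_a @ A])  # final representation

Tc, Ta, d = 7, 3, 8
C = rng.standard_normal((Tc, d))
A = rng.standard_normal((Ta, d))
Wc = rng.standard_normal((d, d))
Wa = rng.standard_normal((d, d))
rep = interactive_attention(C, A, Wc, Wa)
```

The concatenated vector would then feed a softmax classifier over sentiment polarities.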
Semantic role labeling, machine translation, and relation classification. Some works have explored graph neural networks for text classification: they treat documents, sentences, or words as graph nodes and rely on the relations between nodes to build the graph.
(Yao, L.; Mao, C.; Luo, Y. Graph convolutional networks for text classification. arXiv 2018, arXiv:1809.05679.)
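The graph construction described above can be sketched on a toy corpus. This is a simplified illustration of the TextGCN-style heterogeneous graph, with assumptions stated in the comments: edge weights are set to 1.0 here, whereas the paper uses TF-IDF for document-word edges and PMI for word-word edges.

```python
import numpy as np

# Toy corpus: documents and words both become nodes of one graph.
docs = [["the", "cat", "sat"], ["the", "dog", "ran"], ["cat", "and", "dog"]]
vocab = sorted({w for d in docs for w in d})
n_doc = len(docs)
N = n_doc + len(vocab)
widx = {w: n_doc + i for i, w in enumerate(vocab)}

A = np.eye(N)                                    # self-loops
for di, d in enumerate(docs):
    for w in d:                                  # doc-word edges (TF-IDF in the paper)
        A[di, widx[w]] = A[widx[w], di] = 1.0
    for u, v in zip(d, d[1:]):                   # word-word edges from co-occurrence (PMI in the paper)
        A[widx[u], widx[v]] = A[widx[v], widx[u]] = 1.0

# One GCN layer: ReLU(D^{-1/2} A D^{-1/2} X W), with one-hot node features.
Dinv = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
X = np.eye(N)
W = np.random.default_rng(0).standard_normal((N, 16))
H = np.maximum(Dinv @ A @ Dinv @ X @ W, 0.0)
```

A second layer followed by a softmax over the document nodes would yield class predictions, as in the cited work.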