Open Access

Relation Classification via Recurrent Neural Network with Attention and Tensor Layers

China University of Mining and Technology, Xuzhou 210009, China.

Abstract

Relation classification is a crucial component of many Natural Language Processing (NLP) systems. In this paper, we propose a novel bidirectional recurrent neural network architecture based on Long Short-Term Memory (LSTM) cells for relation classification, with an attention layer that aggregates context information at the word level and a tensor layer that detects complex connections between the two entities. Both feature extraction operations take the outputs of the LSTM network as input. Our model allows end-to-end learning from the raw sentences in the dataset, without trimming or reconstructing them. Experiments on the SemEval-2010 Task 8 dataset show that our model outperforms most state-of-the-art methods.
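
To make the described pipeline concrete, below is a minimal PyTorch sketch of the architecture the abstract outlines: a BiLSTM encodes the sentence, a word-level attention layer pools the hidden states into a sentence vector, and a tensor (bilinear) layer models interactions between the two entity representations before classification. This is an illustration under assumptions, not the authors' released code; the layer sizes, the exact bilinear form of the tensor layer, and names such as AttentionTensorRNN are hypothetical. The 19 output classes correspond to SemEval-2010 Task 8 (9 directed relations plus Other).

import torch
import torch.nn as nn

class AttentionTensorRNN(nn.Module):
    """Sketch of a BiLSTM with word-level attention and a tensor layer.

    Illustrative only: dimensions and the bilinear tensor interaction
    are assumptions, not the paper's exact formulation.
    """

    def __init__(self, vocab_size, emb_dim=100, hidden=100, slices=4, n_classes=19):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, bidirectional=True, batch_first=True)
        d = 2 * hidden                        # BiLSTM output size per word
        self.att = nn.Linear(d, 1)            # word-level attention scores
        # Tensor layer: k bilinear slices modeling e1^T W[k] e2 interactions
        self.tensor = nn.Bilinear(d, d, slices)
        self.out = nn.Linear(d + slices, n_classes)

    def forward(self, tokens, e1_pos, e2_pos):
        # tokens: (batch, seq_len); e1_pos/e2_pos: (batch,) entity word indices
        h, _ = self.lstm(self.emb(tokens))             # (batch, seq, d)
        a = torch.softmax(self.att(h).squeeze(-1), 1)  # attention weights over words
        ctx = (a.unsqueeze(-1) * h).sum(dim=1)         # attended sentence vector
        idx = torch.arange(tokens.size(0))
        e1, e2 = h[idx, e1_pos], h[idx, e2_pos]        # entity hidden states
        inter = torch.tanh(self.tensor(e1, e2))        # entity-pair tensor features
        return self.out(torch.cat([ctx, inter], dim=1))

The sketch reflects the abstract's design choice that both feature extractors (attention and tensor layer) consume the LSTM outputs rather than the raw embeddings, so the sentence never needs to be trimmed or reconstructed before encoding.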

Big Data Mining and Analytics
Pages 234-244
Cite this article:
Zhang R, Meng F, Zhou Y, et al. Relation Classification via Recurrent Neural Network with Attention and Tensor Layers. Big Data Mining and Analytics, 2018, 1(3): 234-244. https://doi.org/10.26599/BDMA.2018.9020022


Received: 03 February 2018
Accepted: 01 March 2018
Published: 24 May 2018
© The author(s) 2018