
HARP: A Novel Hierarchical Attention Model for Relation Prediction

Published: 04 January 2021

Abstract

Recent years have witnessed great advances in representation learning (RL)-based models for the knowledge graph relation prediction task. However, these models generally rely on the structural information embedded in encyclopedic knowledge graphs, while the beneficial semantic information provided by lexical knowledge graphs is ignored, leading to shallow understanding and coarse-grained analysis in knowledge acquisition. Therefore, this article introduces concept information derived from a lexical knowledge graph (e.g., Probase) and proposes a novel Hierarchical Attention model for Relation Prediction (HARP), which combines an entity-level attention mechanism with a concept-level attention mechanism to thoroughly integrate multiple semantic signals. Experimental results on two benchmark datasets demonstrate the effectiveness of the proposed method.
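
The abstract describes the model only at a high level: an entity-level attention mechanism and a concept-level attention mechanism are combined to integrate structural signals from the encyclopedic knowledge graph with concept signals from the lexical knowledge graph when scoring a candidate relation. As a rough illustration of what such a two-level attention combination could look like, here is a minimal NumPy sketch; every name, shape, and scoring choice below (attend, score_relation, dot-product scoring, and so on) is a hypothetical assumption for illustration, not the authors' implementation.

import numpy as np

def softmax(x):
    x = x - x.max(axis=-1, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)

def attend(query, items):
    # Dot-product attention: weight each item vector by its similarity to the query.
    scores = items @ query          # (n_items,)
    weights = softmax(scores)       # attention distribution over the items
    return weights @ items          # weighted sum, same dimension as the query

def score_relation(head, tail, head_concepts, tail_concepts, relation):
    # head, tail:     entity embeddings, shape (d,)
    # *_concepts:     concept embeddings for each entity (e.g., isA concepts
    #                 drawn from a lexical KG such as Probase), shape (n, d)
    # relation:       candidate relation embedding, shape (d,)

    # Concept-level attention: summarize each entity's concepts, using the
    # candidate relation as the query so relation-relevant concepts dominate.
    head_ctx = attend(relation, head_concepts)
    tail_ctx = attend(relation, tail_concepts)

    # Entity-level attention: let the relation weigh the four semantic signals
    # (two entity embeddings and two concept summaries) before scoring.
    signals = np.stack([head, tail, head_ctx, tail_ctx])   # (4, d)
    fused = attend(relation, signals)                       # (d,)

    # Illustrative compatibility score between the fused pair representation
    # and the candidate relation (higher means more plausible).
    return float(fused @ relation)

# Toy usage with random vectors; in practice the embeddings would be learned.
d = 8
rng = np.random.default_rng(0)
head, tail, relation = rng.normal(size=(3, d))
head_concepts = rng.normal(size=(3, d))   # three isA concepts for the head entity
tail_concepts = rng.normal(size=(2, d))   # two isA concepts for the tail entity
print(score_relation(head, tail, head_concepts, tail_concepts, relation))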


      Published In

      ACM Transactions on Knowledge Discovery from Data, Volume 15, Issue 2
      (Survey Paper and Regular Papers), April 2021, 524 pages
      ISSN: 1556-4681
      EISSN: 1556-472X
      DOI: 10.1145/3446665

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 04 January 2021
      Accepted: 01 September 2020
      Revised: 01 July 2020
      Received: 01 February 2020
      Published in TKDD Volume 15, Issue 2


      Author Tags

      1. Relation prediction
      2. concept information
      3. hierarchical attention model
      4. knowledge graph
      5. representation learning

      Qualifiers

      • Research-article
      • Research
      • Refereed

      Funding Sources

      • National Integrated Big Data Center Pilot Project
      • China Postdoctoral Science Foundation
      • New Generation of Artificial Intelligence Special Action Project
      • National Natural Science Foundation of China
      • National Key Research and Development Project
      • Joint Advanced Research Foundation of China Electronics Technology Group Corporation (CETC)
