Scrutinizing Label: Contrastive Learning on Label Semantics and Enriched Representation for Relation Extraction

Nguyen T, Grishman R. Event detection and domain adaptation with convolutional neural networks. In: Proceedings of the 53rd annual meeting of the association for computational linguistics and the 7th international joint conference on natural language processing. 2015. p. 365-71.

Lample G, Ballesteros M, Subramanian S, Kawakami K, Dyer C. Neural architectures for named entity recognition. In: Proceedings of the 2016 conference of the North American chapter of the association for computational linguistics: human language technologies. 2016. p. 260-70.

Zha E, Zeng D, Lin M, Shen Y. CEPTNER: contrastive learning enhanced prototypical network for two-stage few-shot named entity recognition. Knowl-Based Syst. 2024;295:111730.

Chen W, Hong D, Zheng C. Learning knowledge graph embedding with entity descriptions based on LSTM networks. In: 2020 IEEE International Symposium on Product Compliance Engineering-Asia (ISPCE-CN). 2020. p. 1-7.

Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez A, et al. Attention is all you need. In: Advances in neural information processing systems, vol. 30. 2017. p. 6000-10.

Devlin J, Chang M, Lee K, Toutanova K. BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 conference of the North American chapter of the association for computational linguistics: human language technologies. 2019. p. 4171-86.

Liu Y, Ott M, Goyal N, Du J, Joshi M, Chen D, et al. RoBERTa: a robustly optimized BERT pretraining approach. 2019. arXiv:1907.11692

Zhou W, Chen M. An improved baseline for sentence-level relation extraction. In: Proceedings of the 2nd conference of the Asia-Pacific chapter of the association for computational linguistics and the 12th international joint conference on natural language processing. 2022. p. 161-8.

Wang X, Gao T, Zhu Z, Zhang Z, Liu Z, Li J, et al. KEPLER: a unified model for knowledge embedding and pre-trained language representation. Trans Assoc Comput Linguistics. 2021;9:176–94.

Baldini Soares L, FitzGerald N, Ling J, Kwiatkowski T. Matching the blanks: distributional similarity for relation learning. In: Proceedings of the 57th annual meeting of the association for computational linguistics. 2019. p. 2895-2905.

Wu S, He Y. Enriching pre-trained language model with entity information for relation classification. In: Proceedings of the 28th ACM international conference on information and knowledge management. 2019. p. 2361-64.

Li Z, Sharaf M, Sitbon L, Du X, Zhou X. CoRE: a context-aware relation extraction method for relation completion. In: 2023 Third International Conference on Advances in Electrical, Computing, Communication and Sustainable Technologies (ICAECT). 2023. p. 1-4.

Huang J, Li B, Xu J, Chen M. Unified semantic typing with meaningful label inference. In: Proceedings of the 2022 conference of the North American chapter of the association for computational linguistics: human language technologies. 2022. p. 2642-54.

Reimers N, Gurevych I. Sentence-BERT: sentence embeddings using siamese BERT-networks. In: Proceedings of the 2019 conference on empirical methods in natural language processing and the 9th international joint conference on natural language processing (EMNLP-IJCNLP). 2019. p. 3980-90.

Nayak T, Majumder N, Goyal P, Poria S. Deep neural approaches to relation triplets extraction: a comprehensive survey. Cogn Comput. 2021;13(5):1215–32.

Mondal A, Cambria E, Das D, Hussain A, Bandyopadhyay S. Relation extraction of medical concepts using categorization and sentiment analysis. Cogn Comput. 2018;10:670–85.

Peng H, Gao T, Han X, Lin Y, Li P, Liu Z, et al. Learning from context or names? An empirical study on neural relation extraction. In: Proceedings of the 2020 conference on empirical methods in natural language processing (EMNLP). 2020. p. 3661-72.

Hu M, Zhang C, Ma F, Liu C, Wen L, Yu P. Semi-supervised relation extraction via incremental meta self-training. In: Findings of the association for computational linguistics: EMNLP 2021. 2021. p. 487-96.

Gao T, Yao H, Chen D. SimCSE: simple contrastive learning of sentence embeddings. In: Proceedings of the 2021 conference on empirical methods in natural language processing. 2021. p. 6894-910.

Khosla P, Teterwak P, Wang C, Sarna A, Tian Y, Isola P, et al. Supervised contrastive learning. In: Proceedings of the 34th international conference on neural information processing systems, vol. 33. 2020. p. 18661-73.

Nguyen D, Matsuo Y, Ishizuka M. Subtree mining for relation extraction from Wikipedia. In: Human language technologies 2007: the conference of the North American chapter of the association for computational linguistics. 2007. p. 125-28.

Liu C, Sun W, Chao W, Che W. Convolution neural network for relation extraction. Advan Data Mining Appl. 2013;8347:231–42.

Zeng D, Liu K, Lai S, Zhou G, Zhao J. Relation classification via convolutional deep neural network. In: Proceedings of COLING 2014, the 25th international conference on computational linguistics. 2014. p. 2335-44.

Nguyen T, Grishman R. Relation extraction: perspective from convolutional neural networks. In: Proceedings of the 1st workshop on vector space modeling for natural language processing. 2015. p. 39-48.

Zhang R, Meng F, Zhou Y, Liu B. Relation classification via recurrent neural network with attention and tensor layers. Big Data Mining Anal. 2018;3(1):234–44.

Xu Y, Mou L, Li G, Chen Y, Peng H, Jin Z. Classifying relations via long short term memory networks along shortest dependency paths. In: Proceedings of the 2015 conference on empirical methods in natural language processing. 2015. p. 1785-94.

Xu S, Sun S, Zhang Z, Xu F, Liu J. BERT gated multi-window attention network for relation extraction. Neurocomputing. 2022;492:516–29.

Peters M, Neumann M, Logan R, Schwartz R, Joshi V, Singh S, et al. Knowledge enhanced contextual word representations. In: Proceedings of the 2019 conference on empirical methods in natural language processing and the 9th international joint conference on natural language processing (EMNLP-IJCNLP). 2019. p. 43-54.

Yamada I, Asai A, Shindo H, Takeda H, Matsumoto Y. LUKE: deep contextualized entity representations with entity-aware self-attention. In: Proceedings of the 2020 conference on empirical methods in natural language processing (EMNLP). 2020. p. 6442-54.

Li C, Tian Y. Downstream model design of pre-trained language model for relation extraction task. 2020. arXiv:2004.03786

Hadsell R, Chopra S, LeCun Y. Dimensionality reduction by learning an invariant mapping. In: 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’06). 2006. p. 1735-42.

Liu J, Liu J, Wang Q, Wang J, Wu W, Xian Y, et al. RankCSE: unsupervised sentence representations learning via learning to rank. In: Proceedings of the 61st annual meeting of the association for computational linguistics. 2023. p. 13785-802.

Gunel B, Du J, Conneau A, Stoyanov V. Supervised contrastive learning for pre-trained language model fine-tuning. 2021. arXiv:2011.01403

Chen T, Shi H, Tang S, Chen Z, Wu F, Zhuang Y. CIL: contrastive instance learning framework for distantly supervised relation extraction. In: Proceedings of the 59th annual meeting of the association for computational linguistics and the 11th international joint conference on natural language processing. 2021. p. 6191-200.

Zhu X, Meng Q, Ding B, Gu L, Yang Y. Weighted pooling for image recognition of deep convolutional neural networks. Clust Comput. 2019;22(Suppl 4):9371–83.

Zhang Y, Zhong V, Chen D, Angeli G, Manning C. Position-aware attention and supervised data improve slot filling. In: Proceedings of the 2017 conference on empirical methods in natural language processing. 2017. p. 35-45.

Alt C, Gabryszak A, Hennig L. TACRED revisited: a thorough evaluation of the TACRED relation extraction task. In: Proceedings of the 58th annual meeting of the association for computational linguistics. 2020. p. 1558-69.

Stoica G, Platanios E, Poczos B. Re-TACRED: addressing shortcomings of the TACRED dataset. In: Proceedings of the AAAI conference on artificial intelligence, vol. 35. 2021. p. 13843-50.

Zhang Y, Qi P, Manning C. Graph convolution over pruned dependency trees improves relation extraction. In: Proceedings of the 2018 conference on empirical methods in natural language processing. 2018. p. 2205-15.

Kipf T, Welling M. Semi-supervised classification with graph convolutional networks. 2017. arXiv:1609.02907

Joshi M, Chen D, Liu Y, Weld D, Zettlemoyer L, Levy O. SpanBERT: improving pre-training by representing and predicting spans. Trans Assoc Comput Linguistics. 2020;8:64–77.

Yamamoto Y, Matsuzaki T. Absolute position embedding learns sinusoid-like waves for attention based on relative position. In: Proceedings of the 2023 conference on empirical methods in natural language processing. 2023. p. 15-28.

Klein T, Nabi M. miCSE: mutual information contrastive learning for low-shot sentence embeddings. In: Proceedings of the 61st annual meeting of the association for computational linguistics. 2023. p. 6159-77.

Zhuang J, Jing X, Jia X. Mining negative samples on contrastive learning via curricular weighting strategy. Inf Sci. 2024;668:120534.

Wang T, Chen L, Zhu X, Lee Y, Gao J. Weighted contrastive learning with false negative control to help long-tailed product classification. In: Proceedings of the 61st annual meeting of the association for computational linguistics. 2023. p. 574-80.
