Knowledge Graph Question Answering based on Contrastive Learning and Feature Transformation
Proceedings - 2022 IEEE 22nd International Conference on Software Quality, Reliability and Security Companion, QRS-C 2022
Traditional Knowledge Graph Question Answering (KGQA) systems usually focus on entity recognition and relation detection. Common relation detection methods cannot detect new relations that lack corresponding word entries in the system, and error propagation through the pipeline loses some semantic-similarity information. In this paper, we propose an end-to-end knowledge graph question-answering framework (TransCL). Latent knowledge is first mined from the knowledge base, and augmented information is generated in the form of question-answer pairs. Positive features are then transformed into hard positive features using a feature transformation method based on positive extrapolation. We use contrastive learning to aggregate vectors while retaining the original information, capturing deep matching features between data samples through contrast. TransCL is therefore better at fuzzy matching and at handling unseen inputs. Experiments show that our method achieves an F1 score of 85.50% on the NLPCC-ICCPOL-2016 open-domain QA dataset.
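The abstract does not give the paper's exact formulation, but the two core ideas it names — extrapolating a positive feature away from its anchor to make it "harder", then scoring it with a contrastive objective — can be illustrated with a minimal NumPy sketch. Here `extrapolate_positive`, the mixing coefficient `lam`, and the InfoNCE-style loss are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def extrapolate_positive(anchor, positive, lam=0.5):
    # Positive extrapolation (assumed form): push the positive
    # away from the anchor along the line connecting them,
    # producing a "harder" positive for the contrastive loss.
    hard = (1 + lam) * positive - lam * anchor
    return l2_normalize(hard)

def info_nce(anchor, positive, negatives, temperature=0.07):
    # Standard InfoNCE loss: one positive against a set of negatives.
    pos_sim = np.dot(anchor, positive) / temperature
    neg_sims = negatives @ anchor / temperature
    logits = np.concatenate([[pos_sim], neg_sims])
    logits -= logits.max()  # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])

rng = np.random.default_rng(0)
anchor = l2_normalize(rng.normal(size=64))          # e.g. question embedding
positive = l2_normalize(anchor + 0.1 * rng.normal(size=64))  # matching answer
negatives = l2_normalize(rng.normal(size=(8, 64)))  # non-matching answers

hard_pos = extrapolate_positive(anchor, positive, lam=0.5)
easy_loss = info_nce(anchor, positive, negatives)
hard_loss = info_nce(anchor, hard_pos, negatives)
```

Because the extrapolated positive is less similar to the anchor than the original positive, it yields a larger contrastive loss, which is what makes it a "difficult" positive during training.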
Open Access Status
This publication is not available as open access