Knowledge Graph Embedding Space
A Knowledge Graph Embedding Space is a graph embedding space for a knowledge graph.
- Context:
- It can range from being a Structured KG Embedding (e.g. of a Semi-Structured KG) to being an Embedded KG.
- It can range from being a Small KG Embedding to being a Medium-Sized KG Embedding to being a Large KG Embedding.
- It can be produced by an Embedded KG Construction Task.
- …
- Example(s):
- ...
- …
- Counter-Example(s):
- a Structured KG, such as the Freebase KG, LinkedIn's KG, or the Google KG.
- See: Lightweight Ontology, Graph Embedding, Ontology Embedding.
References
2023
- (Gema et al., 2023) ⇒ Aryo Pradipta Gema, Dominik Grabarczyk, Wolf De Wulf, Piyush Borole, Javier Antonio Alfaro, Pasquale Minervini, Antonio Vergari, and Ajitha Rajan. (2023). “Knowledge Graph Embeddings in the Biomedical Domain: Are They Useful? A Look at Link Prediction, Rule Learning, and Downstream Polypharmacy Tasks.” doi:10.48550/arXiv.2305.19979
2023
- chat
- Q: What is a neural knowledge graph embedding?
- A: A neural knowledge graph embedding is a method that represents entities and relationships in a knowledge graph as numerical vectors (embeddings) in a high-dimensional space. These embeddings are learned using neural networks and capture the structure and meaning of the relationships in the graph. This representation enables various tasks such as link prediction and entity classification to be performed in a computationally efficient manner.
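The vector representation described above can be illustrated with a minimal sketch: a TransE-style translational score used for link prediction over a hypothetical toy KG (all entity and relation names, and the random embeddings, are purely illustrative; a real system would learn these vectors by training a neural model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy KG vocabulary (names are made up for illustration).
entities = ["aspirin", "headache", "ibuprofen", "fever"]
relations = ["treats", "causes"]

dim = 8  # embedding dimensionality
E = rng.normal(size=(len(entities), dim))   # one vector per entity
R = rng.normal(size=(len(relations), dim))  # one vector per relation

def score(h, r, t):
    """TransE-style plausibility score: -||h + r - t||; higher is more plausible."""
    return -np.linalg.norm(E[h] + R[r] - E[t])

# Link prediction: rank all candidate tail entities for (aspirin, treats, ?).
h, r = entities.index("aspirin"), relations.index("treats")
ranked = sorted(range(len(entities)), key=lambda t: -score(h, r, t))
print([entities[t] for t in ranked])  # candidates, most plausible first
```

With untrained (random) embeddings the ranking is arbitrary; the point is only that, once embeddings are learned, tasks like link prediction reduce to cheap vector arithmetic and nearest-neighbor ranking.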
2023
- (Suri et al., 2023) ⇒ Kunal Suri, Atul Singh, Prakhar Mishra, Swapna Sourav Rout, and Rajesh Sabapathy. (2023). “Language Models Sounds the Death Knell of Knowledge Graphs.” arXiv preprint arXiv:2301.03980
- QUOTE: ... Healthcare has been dealing with an explosion in information about different types of drugs, diseases, and procedures. This paper argues that using Knowledge Graphs is not the best solution for solving problems in this domain. We present experiments using LLMs for the healthcare domain to demonstrate that language models provide the same functionality as knowledge graphs, thereby making knowledge graphs redundant.
2017
- (Wang et al., 2017) ⇒ Quan Wang, Zhendong Mao, Bin Wang, and Li Guo. (2017). “Knowledge Graph Embedding: A Survey of Approaches and Applications.” IEEE Transactions on Knowledge and Data Engineering 29, no. 12
- ABSTRACT: Knowledge graph (KG) embedding is to embed components of a KG including entities and relations into continuous vector spaces, so as to simplify the manipulation while preserving the inherent structure of the KG. It can benefit a variety of downstream tasks such as KG completion and relation extraction, and hence has quickly gained massive attention. In this article, we provide a systematic review of existing techniques, including not only the state-of-the-arts but also those with latest trends. Particularly, we make the review based on the type of information used in the embedding task. Techniques that conduct embedding using only facts observed in the KG are first introduced. We describe the overall framework, specific model design, typical training procedures, as well as pros and cons of such techniques. After that, we discuss techniques that further incorporate additional information besides facts. We focus specifically on the use of entity types, relation paths, textual descriptions, and logical rules. Finally, we briefly introduce how KG embedding can be applied to and benefit a wide variety of downstream tasks such as KG completion, relation extraction, question answering, and so forth.
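The training framework the survey describes — embedding entities and relations into a continuous vector space and optimizing so that observed facts score better than corrupted ones — can be sketched roughly as follows (a toy margin-ranking update in the style of translational models such as TransE; the KG, hyperparameters, and single negative sample are illustrative assumptions, not the survey's exact procedure):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: 3 entities, 1 relation, 4-dimensional embeddings.
n_ent, n_rel, dim = 3, 1, 4
E = rng.normal(scale=0.1, size=(n_ent, dim))
R = rng.normal(scale=0.1, size=(n_rel, dim))

def dist(h, r, t):
    """Translational distance ||h + r - t||; smaller means more plausible."""
    return np.linalg.norm(E[h] + R[r] - E[t])

# Margin-ranking updates: push the observed triple (0, 0, 1) to score
# better (smaller distance) than the corrupted triple (0, 0, 2).
margin, lr = 1.0, 0.1
for _ in range(200):
    pos, neg = dist(0, 0, 1), dist(0, 0, 2)
    if pos + margin <= neg:            # margin satisfied, stop early
        break
    # Gradients of ||h + r - t|| with respect to the embeddings.
    g_pos = (E[0] + R[0] - E[1]) / (pos + 1e-9)
    g_neg = (E[0] + R[0] - E[2]) / (neg + 1e-9)
    E[0] -= lr * (g_pos - g_neg)       # head entity
    R[0] -= lr * (g_pos - g_neg)       # relation
    E[1] -= lr * (-g_pos)              # true tail: pulled closer
    E[2] -= lr * (+g_neg)              # corrupted tail: pushed away
```

After training, the observed triple has a smaller translational distance than the corrupted one, which is exactly the ranking property that downstream KG completion relies on.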