[TOC]
Abstract
KnowLA leverages knowledge graph embeddings to improve the effectiveness of PEFT: it inserts an adaptation layer into an LLM to integrate the embeddings of entities appearing in the input text. The adaptation layer is trained in combination with LoRA on instruction data.
Result: KnowLA can help activate the relevant parameterized knowledge in an LLM to answer a question without changing its parameters or input prompts.
KnowLA
Idea: inject the embeddings of entities appearing in the text into the PEFT layer.
Steps:
(1) Entity linking, which links the tokens in a question to entities in the KG.
(2) Knowledge mapping and injection, which maps the KG embedding space to the LLM’s representation space and infuses the entity embeddings corresponding to a specific token in the question.
(3) Knowledge fusion, which integrates each token representation with its entity embedding.
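To make the three steps concrete, here is a minimal, self-contained sketch of what one adaptation-layer pass could look like; the mapper architecture, the additive fusion, and all dimensions are my assumptions, not the paper's exact design:

```python
import torch
import torch.nn as nn

d_kg, d_h = 100, 4096                      # assumed KG-embedding and LLM hidden sizes

# (2) A small mapper from KG space to the LLM's representation space.
mapper = nn.Sequential(nn.Linear(d_kg, d_h), nn.SiLU(), nn.Linear(d_h, d_h))

def knowla_adapter(hidden_states, token_entity_ids, kg_embeddings):
    """hidden_states: (seq_len, d_h) from one transformer layer;
    token_entity_ids: entity id per token (or -1 if the token is unlinked);
    kg_embeddings: (num_entities, d_kg) frozen pretrained KG embedding table."""
    out = hidden_states.clone()
    for i, ent_id in enumerate(token_entity_ids):        # (1) produced by entity linking
        if ent_id < 0:
            continue                                     # no KG entity for this token
        e = mapper(kg_embeddings[ent_id])                # (2) map + inject
        out[i] = out[i] + e                              # (3) fuse (simple additive variant)
    return out

# Toy usage
h = torch.randn(6, d_h)
ids = [-1, 3, 3, -1, 7, -1]                              # output of the entity-linking step
fused = knowla_adapter(h, ids, torch.randn(50, d_kg))
print(fused.shape)                                       # torch.Size([6, 4096])
```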
Entity Linking
The goal is to link important tokens in the text to KG entities: We use the text-rank algorithm to recognize important tokens and link the recognized tokens to the KG by string matching.
However, the concrete implementation is unclear; the code needs to be reviewed.
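As a placeholder until the code is reviewed, a rough sketch of what "TextRank + string matching" could look like; the `summa` library, the lower-cased exact-match rule, and the toy `kg_entity_index` are all assumptions on my part:

```python
# pip install summa
from summa import keywords as textrank

# Toy KG index: entity surface form -> entity id (in practice, names from the actual KG)
kg_entity_index = {"capital": 101, "france": 102, "paris": 103}

def link_entities(question: str):
    """Return {token: entity_id} for important tokens that match a KG entity by string."""
    # 1) TextRank picks the salient words in the question.
    salient = textrank.keywords(question, split=True)
    # 2) Link each salient token to the KG by (lower-cased) exact string matching.
    links = {}
    for tok in salient:
        ent_id = kg_entity_index.get(tok.lower())
        if ent_id is not None:
            links[tok] = ent_id
    return links

print(link_entities("What is the capital of France, and when was the capital founded?"))
# typically links 'capital' and 'france' (exact output depends on the TextRank implementation)
```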
Knowledge Mapping and Injection
It does two things:
- Compute an average representation of the hidden_states.
- Map the KG embedding space into the text space (which is also larger and more expressive); see the sketch after this list.
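A speculative sketch of these two pieces, assuming the mean-pooled hidden state acts as a gate on the mapped entity embedding and that the mapping is two linear layers (d_kg → d_h); none of this is confirmed against the paper's code:

```python
import torch
import torch.nn as nn

class KnowledgeMapper(nn.Module):
    """Sketch: map KG-space entity embeddings (d_kg) into the LLM's larger text space (d_h)."""
    def __init__(self, d_kg: int = 100, d_h: int = 4096):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(d_kg, d_h), nn.SiLU(), nn.Linear(d_h, d_h))

    def forward(self, entity_emb, hidden_states):
        # Average representation of the layer's hidden states (first bullet above).
        h_mean = hidden_states.mean(dim=0)                       # (d_h,)
        # Map the KG embedding into text space (second bullet above).
        e_mapped = self.proj(entity_emb)                         # (d_h,)
        # Assumed use of the pooled state: a scalar gate on injection strength.
        gate = torch.sigmoid(h_mean @ e_mapped / h_mean.numel() ** 0.5)
        return gate * e_mapped

mapper = KnowledgeMapper()
h = torch.randn(12, 4096)          # hidden states of a 12-token question
e = torch.randn(100)               # pretrained KG embedding of one linked entity
print(mapper(e, h).shape)          # torch.Size([4096])
```

If the actual fusion turns out to be different (e.g., attention-based rather than a scalar gate), only the last two lines of `forward` would need to change; the mapping and mean-pooling parts stay the same.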