Abstract: In natural language generation, topic text generation is a challenging task; the main difficulty is that the amount of source information is far smaller than the amount of information in the generated target. To address this problem, this paper proposes Trans-K, a topic text generation model based on external knowledge filtering, which enriches the source information by introducing external knowledge related to the topic words and thereby improves the quality of the generated text. To resolve the "polysemy" problem that arises when introducing external knowledge, we propose a topic vector calculation method based on linear transformation that filters external knowledge consistent with the semantics of the topic words. We further propose an external weight calculation method based on the attention mechanism, which assigns a topic weight to each external word so that it better fits the semantics of the text. To address the problem of topic words (including candidate words) appearing repeatedly in the generated text, we propose an internal weight calculation method based on the multi-head attention mechanism. Experiments on the ESSAY dataset show that Trans-K outperforms the baselines on various measures of generated-text quality. In addition, human evaluation shows that the model generates text that is more topic-relevant, linguistically coherent, and semantically logical.