RDF verbalization, which aims to generate a natural language description of a knowledge base, has received increasing interest. Transformer-based sequence-to-sequence models can obtain strong performance when equipped with pre-trained models such as BART and T5. However, in spite of the general gains introduced by these pre-trained models, the task is still limited by the small scale of the training dataset. To address this problem, we propose t...