A Novel GCN Architecture for Text Generation from Knowledge Graphs: Full Node Embedded Strategy and Context Gate with Copy and Penalty Mechanism

Tracking #: 2654-3868

This paper is currently under review
Zhongqiang Hu
Weiwen Zhang
Depei Wang
Weicai Niu
Fei Mo
Jianwen Ma
Guoheng Huang
Lianglun Cheng

Responsible editor: 
Philipp Cimiano

Submission type: 
Full Paper
Abstract: 
Text generation from knowledge graphs is a fundamental task that aims to map triplets to descriptive text. Previous research mostly adopts standard sequence-to-sequence methods, which inevitably fail to capture graph-structure information. In this paper, we propose a novel neural network architecture called GCN-FCCP, a Graph Convolutional Network equipped with a full node embedded strategy and context gates with a copy and penalty mechanism. The full node embedded strategy embeds each word in the input triplets as a new node to enrich the graph information, while a stacked multi-layer graph convolutional network serves as the encoder to exploit the input structure directly. For the decoder, we integrate a context gate into the LSTM network to retain contextual information during hidden-state updates, which keeps the generated text faithful to the original meaning. In addition, we add copy attention and a penalty mechanism to the decoder to address the out-of-vocabulary (OOV) problem and improve the quality of the generated sentences. Extensive experiments on the WebNLG dataset show that GCN-FCCP effectively generates high-quality text from graph-structured input, achieving high scores on four automatic metrics.
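The two core ideas the abstract names can be sketched compactly: a graph-convolution layer that propagates node features over the triplet graph, and a context gate that mixes the source context into the decoder's hidden state. The sketch below is a minimal, self-contained illustration; the shapes, normalization, and gating form are illustrative assumptions, not the authors' exact implementation.

```python
# Illustrative sketch (not the paper's code): one GCN layer over a tiny
# triplet graph, plus a scalar context gate blending decoder state with
# the encoder-side context vector. Pure Python, no external dependencies.
import math

def matmul(A, B):
    """Multiply two matrices given as lists of lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def gcn_layer(adj, H, W):
    """One GCN layer: H' = ReLU(A_hat @ H @ W), where A_hat is the
    adjacency matrix with self-loops, row-normalized by node degree."""
    n = len(adj)
    # Self-loops let each node keep its own features during aggregation.
    A = [[adj[i][j] + (1 if i == j else 0) for j in range(n)]
         for i in range(n)]
    # Row-normalize by degree so neighbor features are averaged.
    A_hat = [[v / sum(row) for v in row] for row in A]
    Z = matmul(matmul(A_hat, H), W)
    return [[max(0.0, v) for v in row] for row in Z]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def context_gate(state, context, w_s, w_c):
    """Scalar gate z = sigmoid(w_s . state + w_c . context); the update
    interpolates element-wise between decoder state and source context."""
    z = sigmoid(sum(a * b for a, b in zip(w_s, state)) +
                sum(a * b for a, b in zip(w_c, context)))
    return [z * s + (1.0 - z) * c for s, c in zip(state, context)]

# Tiny 3-node chain standing in for a (subject, relation, object) triplet.
adj = [[0, 1, 0],
       [1, 0, 1],
       [0, 1, 0]]
H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # 2-d node features
W = [[0.5, -0.2], [0.3, 0.8]]              # stand-in learned weights
H1 = gcn_layer(adj, H, W)
gated = context_gate([0.2, -0.1], [0.5, 0.4], [1.0, 1.0], [1.0, 1.0])
```

Stacking several such layers, as the abstract describes, widens each node's receptive field one hop per layer, which is how the encoder captures multi-hop graph structure that a flat sequence encoder would lose.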
Full PDF Version: 
Under Review