Language Generation with Multi-hop Reasoning on Commonsense Knowledge Graph

Despite the success of generative pre-trained language models on a series of text generation tasks, they still suffer in cases where reasoning over underlying commonsense knowledge is required during generation.
Existing approaches that integrate commonsense knowledge into generative pre-trained language models simply transfer relational knowledge by post-training on individual knowledge triples, ignoring the rich connections within the knowledge graph.
In this paper, we propose Generation with Multi-Hop Reasoning Flow (GRF), which enables pre-trained models to perform dynamic multi-hop reasoning on multi-relational paths extracted from an external commonsense knowledge graph.
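To give a concrete picture of the multi-relational paths the abstract refers to, the sketch below enumerates paths of up to a fixed number of hops from a source concept in a toy, ConceptNet-style triple store. The triples, relation names, and the breadth-first enumeration are illustrative assumptions, not the actual graph or extraction procedure used by GRF.

```python
from collections import deque

# Toy multi-relational commonsense triples (head, relation, tail).
# Purely illustrative -- not the actual knowledge graph used by GRF.
TRIPLES = [
    ("accident", "RelatedTo", "injury"),
    ("injury", "Causes", "pain"),
    ("pain", "MotivatedByGoal", "relief"),
    ("accident", "Causes", "damage"),
]

def build_graph(triples):
    """Index triples as an adjacency list: head -> [(relation, tail), ...]."""
    graph = {}
    for head, rel, tail in triples:
        graph.setdefault(head, []).append((rel, tail))
    return graph

def multi_hop_paths(graph, source, max_hops=2):
    """Enumerate all relational paths of 1..max_hops hops from a source concept."""
    paths = []
    queue = deque([(source, [])])  # (current node, path of triples so far)
    while queue:
        node, path = queue.popleft()
        if path:
            paths.append(path)
        if len(path) < max_hops:
            for rel, tail in graph.get(node, []):
                queue.append((tail, path + [(node, rel, tail)]))
    return paths

if __name__ == "__main__":
    graph = build_graph(TRIPLES)
    for path in multi_hop_paths(graph, "accident"):
        print(" ; ".join(f"{h} -{r}-> {t}" for h, r, t in path))
```

A model in the spirit of GRF would score and aggregate evidence along such paths at each decoding step, rather than conditioning on isolated triples.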