Contextualizing Language Understanding with Graph-based Knowledge Representations
Chen, Sanxing, Computer Science - School of Engineering and Applied Science, University of Virginia
Ji, Yangfeng, Computer Science - School of Engineering and Applied Science, University of Virginia
Language understanding requires not only linguistic knowledge but also knowledge external to textual symbols. In many application domains, a vast amount of such knowledge is stored as graph-structured data. Despite growing interest in knowledge-driven approaches in the community, how to build powerful representations of graph-structured knowledge and effectively incorporate them into language understanding models remains a challenging problem in natural language processing research.
This thesis explores the direction of contextualizing language understanding with graph-based knowledge representations. I first demonstrate the challenges of building meaningful interactions between language representations and domain-specific knowledge representations in the task of cross-domain Text-to-SQL semantic parsing. Drawing on this example, I propose fostering multiple connections between the two representations at different levels of abstraction, and I use this idea to substantially improve two graph neural network-based semantic parsers. To implement the idea in a more general form that benefits a wider range of language understanding tasks, I propose a new knowledge graph representation model that shares a Transformer architecture similar to that of prevalent language models. In the task of factoid question answering, I show that the proposed knowledge representations can be effectively integrated into state-of-the-art pre-trained language models via a simple cross-modality attention mechanism.
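To make the integration step concrete, the sketch below illustrates one plausible form of such a cross-modality attention mechanism: token representations from a pre-trained language model act as queries that attend over entity embeddings produced by a knowledge graph representation model. This is a minimal single-head sketch under assumed interfaces, not the thesis's exact method; the module and parameter names (CrossModalityAttention, text_dim, kg_dim, attn_dim) are hypothetical.

```python
import torch
import torch.nn as nn


class CrossModalityAttention(nn.Module):
    """Minimal single-head cross-attention: text token states (queries)
    attend over knowledge graph entity embeddings (keys/values)."""

    def __init__(self, text_dim: int, kg_dim: int, attn_dim: int = 256):
        super().__init__()
        self.q_proj = nn.Linear(text_dim, attn_dim)    # token states -> queries
        self.k_proj = nn.Linear(kg_dim, attn_dim)      # entity embeddings -> keys
        self.v_proj = nn.Linear(kg_dim, attn_dim)      # entity embeddings -> values
        self.out_proj = nn.Linear(attn_dim, text_dim)  # fused KG context -> text space
        self.scale = attn_dim ** -0.5

    def forward(self, token_states, entity_embs, entity_mask=None):
        # token_states: (batch, seq_len, text_dim) from a pre-trained LM
        # entity_embs:  (batch, n_entities, kg_dim) from a KG representation model
        # entity_mask:  (batch, n_entities), 1 for real entities, 0 for padding
        q = self.q_proj(token_states)
        k = self.k_proj(entity_embs)
        v = self.v_proj(entity_embs)
        scores = torch.matmul(q, k.transpose(-1, -2)) * self.scale
        if entity_mask is not None:
            # Exclude padding entities from the attention distribution.
            scores = scores.masked_fill(entity_mask[:, None, :] == 0, float("-inf"))
        attn = scores.softmax(dim=-1)
        kg_context = torch.matmul(attn, v)  # (batch, seq_len, attn_dim)
        # Residual fusion: enrich each token representation with KG context.
        return token_states + self.out_proj(kg_context)
```

In this sketch, the residual connection lets the language model's representations pass through largely unchanged when no relevant entity is found; a full model would typically add multiple heads, layer normalization, and dropout in the style of a standard Transformer layer.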
MS (Master of Science)
natural language processing, knowledge graph representation, semantic parsing
English
2020/12/02