Title: Structured Minimally Supervised Learning for Neural Relation Extraction
Abstract: In this talk, I will describe our recent work (accepted to NAACL 2019) on extracting structured knowledge from text without relying on slow and expensive human labeling. Our approach combines the benefits of learned representations and structured learning, and accurately predicts sentence-level relation mentions given only proposition-level supervision from a knowledge base. By explicitly reasoning about missing data during learning, the method enables large-scale training of convolutional neural networks while mitigating the label noise inherent in distant supervision. It achieves state-of-the-art results on minimally supervised sentential relation extraction, outperforming a number of baselines, including a competitive approach that uses the attention layer of a purely neural model.
Bio: Fan Bai is a third-year PhD student in the Department of Computer Science and Engineering, advised by Prof. Alan Ritter. His research focuses on extracting structured knowledge from large corpora under distant supervision.