Discourse relations still matter for neural NLG

Our INLG-21 paper revisits our earlier experiments reimplementing a classic rule-based NLG system with a neural model. We find that explicitly representing discourse relations in the input remains essential for best performance in low-data settings, even when using pre-trained models.
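
As a rough illustration of what "representing discourse relations" can mean (a sketch, not the paper's actual encoding), one common approach is to splice a relation token into the linearized input of a sequence-to-sequence generator. The token names and input format below are hypothetical:

```python
# Hypothetical sketch: marking a discourse relation in the linearized
# input of a seq2seq NLG model. Token names and structure are
# illustrative, not the encoding used in the paper.

# Two content units to be realized together.
unit_a = "temperature=falling"
unit_b = "wind=increasing"

# Without a relation token, the model must guess how the units connect.
flat_input = f"[UNIT] {unit_a} [UNIT] {unit_b}"

# With an explicit relation token (e.g., CONTRAST), the intended
# connective ("but", "although", ...) is recoverable from the input.
relational_input = f"[CONTRAST] [UNIT] {unit_a} [UNIT] {unit_b}"

print(flat_input)
print(relational_input)
```

In low-data settings, cues like this can matter because the model sees too few examples to infer the relation from content alone.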