Does your neural NLG system need discourse relations? Our PDTB experiments say so

Our new SIGDIAL 2022 paper reports broad-coverage experiments on the Penn Discourse Treebank (PDTB) that quantify and analyze how much including discourse relations in the input to a pretrained neural language model helps it accurately generate discourse connectives that convey the intended meaning. Notably, we find that cognitive discourse-processing heuristics help explain the error patterns that arise when the model must predict discourse connectives without being told the intended relation!