Amazing progress on data efficiency

Our INLG-21 paper shows that combining constrained decoding with self-training and pre-trained models can cut the training data needed for a challenging compositional neural NLG dataset to just hundreds of examples, a level at which crowdsourcing is no longer necessary!
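
To make the general idea concrete, here is a minimal, hypothetical sketch of a constrained-decoding self-training loop in Python. All of the names (`self_train`, `fine_tune`, `passes_constraints`) and the string-based MR format are illustrative assumptions for exposition, not the paper's actual code or constraint mechanism.

```python
from typing import Callable, List, Tuple

# Hypothetical setup: a meaning representation (MR) is a "slot=value;..." string,
# and a "model" is any MR-in/text-out callable standing in for a pre-trained LM.
Model = Callable[[str], str]
Example = Tuple[str, str]  # (MR, reference text)

def passes_constraints(mr: str, text: str) -> bool:
    """Toy semantic check: every slot value in the MR must appear in the text.
    Real constrained decoding would enforce this during generation instead of
    filtering afterwards; this stand-in only illustrates the filtering idea."""
    values = [seg.split("=", 1)[-1] for seg in mr.split(";")]
    return all(v in text for v in values)

def self_train(model: Model, fine_tune, labelled: List[Example],
               unlabelled_mrs: List[str], rounds: int = 3) -> Model:
    """Generic self-training loop:
    1. fine-tune on the small labelled seed set,
    2. generate outputs for unlabelled MRs,
    3. keep only outputs that satisfy the semantic constraints,
    4. add them as pseudo-labelled data and repeat."""
    data = list(labelled)
    for _ in range(rounds):
        model = fine_tune(model, data)
        pseudo = [(mr, model(mr)) for mr in unlabelled_mrs]
        data += [(mr, text) for mr, text in pseudo if passes_constraints(mr, text)]
    return model

# Toy usage: an echo "model" and a no-op fine_tune, purely to show the loop runs.
toy_model = lambda mr: " ".join(seg.split("=", 1)[-1] for seg in mr.split(";"))
trained = self_train(
    toy_model,
    fine_tune=lambda model, data: model,  # placeholder: no actual training
    labelled=[("name=Aromi;food=Chinese", "Aromi serves Chinese food.")],
    unlabelled_mrs=["name=Bibimbap House;food=Korean"],
)
```

The key design point the sketch tries to convey is that the semantic constraints keep noisy pseudo-labels out of the growing training set, which is what lets self-training work from such a small seed.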