BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension [2019, Facebook AI]

- Introduction
- Model
  - Architecture
  - Pre-training BART
- Fine-tuning BART
- Comparing Pre-training Objectives
  - Tasks
  - Results
- Large-scale Pre-training Experiments
  - Discriminative Tasks
  - Generation Tasks
    - Summarization
    - Dialogue
    - Abstractive QA
  - Translation
- Qualitative Analysis
- Reference