Abbreviation of Pre-training with Extracted Gap-sentences for Abstractive SUmmarization Sequence-to-sequence models (the capitalization spells out the acronym). Aside from the new pretraining task, Gap Sentences Generation (GSG), its pretraining resembles BERT's masked language modeling; the model itself, however, is a Transformer encoder-decoder rather than an encoder-only BERT.
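To make the GSG objective concrete, here is a minimal Python sketch of how one (source, target) pretraining pair can be built: whole sentences are scored, the top-scoring ones are replaced by a mask token in the input, and the target is to generate exactly those removed sentences. The names (`make_gsg_example`, `rouge1_f`, `gap_ratio`), the `<mask_1>` token string, and the crude unigram-overlap scoring are illustrative simplifications; the actual PEGASUS recipe selects "principal" sentences by ROUGE against the rest of the document and uses a gap-sentence ratio of around 30% for its large model.

```python
# A minimal sketch of Gap Sentences Generation (GSG) data construction.
# Scoring below is a crude unigram-overlap proxy for ROUGE-1, used only
# for illustration; all names here are assumptions, not the paper's code.
from collections import Counter

MASK_TOKEN = "<mask_1>"  # a sentence-level mask token (illustrative)

def rouge1_f(candidate: str, reference: str) -> float:
    """Crude ROUGE-1 F1 between two whitespace-tokenized strings."""
    c, r = Counter(candidate.split()), Counter(reference.split())
    overlap = sum((c & r).values())
    if overlap == 0:
        return 0.0
    prec, rec = overlap / sum(c.values()), overlap / sum(r.values())
    return 2 * prec * rec / (prec + rec)

def make_gsg_example(sentences: list[str], gap_ratio: float = 0.3):
    """Pick the top-scoring 'gap sentences', mask them in the source,
    and concatenate them as the generation target."""
    n_gaps = max(1, int(len(sentences) * gap_ratio))
    scores = []
    for i, sent in enumerate(sentences):
        rest = " ".join(s for j, s in enumerate(sentences) if j != i)
        scores.append((rouge1_f(sent, rest), i))
    gap_ids = {i for _, i in sorted(scores, reverse=True)[:n_gaps]}
    source = " ".join(MASK_TOKEN if i in gap_ids else s
                      for i, s in enumerate(sentences))
    target = " ".join(sentences[i] for i in sorted(gap_ids))
    return source, target  # the model learns to generate target from source

doc = [
    "PEGASUS pretrains a model for abstractive summarization.",
    "The model learns summarization by generating masked sentences.",
    "Coffee consumption is unrelated to this example.",
]
src, tgt = make_gsg_example(doc)
print(src)  # document with the selected sentence replaced by <mask_1>
print(tgt)  # the removed sentence the model must generate
```

Because the target is a generated sequence of removed sentences rather than individual masked tokens, this objective directly mimics abstractive summarization, which is the motivation for GSG over plain token masking.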