Generative Pre-trained Transformer 2 (GPT-2) is a large language model developed by OpenAI and the second in its foundational series of GPT models. GPT-2 was pre-trained on a dataset of 8 million web pages.