Status: Archive (code is provided as-is, no updates expected)
Code from the paper "Language Models are Unsupervised Multitask Learners".
We have currently released small (117M parameter) and medium (345M parameter) versions of GPT-2. While we have not released the larger models, we have released a dataset of model outputs for researchers to study their behavior.
See more details in our blog post.
This repository is meant to be a starting point for researchers and engineers to experiment with GPT-2.
For basic information, see our model card.
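As a quick, unofficial illustration of what experimenting with the code looks like, here is a minimal sketch of unconditional sampling. It assumes you have fetched the 117M weights into models/117M (e.g. with download_model.py) and that the repository's src/ directory is on your Python path; the module and function names follow the repository layout, but treat this as a sketch rather than a supported script.

```python
import json
import os

import tensorflow as tf  # the codebase targets TensorFlow 1.x

import encoder  # from src/
import model    # from src/
import sample   # from src/

model_name = '117M'

# Load the BPE tokenizer and the hyperparameters shipped with the weights.
enc = encoder.get_encoder(model_name)
hparams = model.default_hparams()
with open(os.path.join('models', model_name, 'hparams.json')) as f:
    hparams.override_from_dict(json.load(f))

with tf.Session(graph=tf.Graph()) as sess:
    # Build a sampling graph: start from <|endoftext|> and draw tokens
    # autoregressively with top-k filtering.
    output = sample.sample_sequence(
        hparams=hparams,
        length=64,
        start_token=enc.encoder['<|endoftext|>'],
        batch_size=1,
        temperature=1.0,
        top_k=40,
    )[:, 1:]  # drop the start token from the returned sequence

    # Restore the released checkpoint and decode one sample.
    saver = tf.train.Saver()
    saver.restore(sess, tf.train.latest_checkpoint(os.path.join('models', model_name)))
    print(enc.decode(sess.run(output)[0]))
```

The bundled scripts (src/generate_unconditional_samples.py and src/interactive_conditional_samples.py) wrap essentially this flow behind command-line flags, so they are the better starting point in practice.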
Please let us know if you’re doing interesting research with or working on applications of GPT-2! We’re especially interested in hearing from and potentially working with those who are studying:
- potential malicious use cases and defenses against them (e.g. the detectability of synthetic text), and
- the extent of problematic content (e.g. bias) baked into the models and effective mitigations.
For installation and development instructions, see DEVELOPERS.md.
For a list of contributors, see CONTRIBUTORS.md.
Please use the following BibTeX entry:
@article{radford2019language,
  title={Language Models are Unsupervised Multitask Learners},
  author={Radford, Alec and Wu, Jeff and Child, Rewon and Luan, David and Amodei, Dario and Sutskever, Ilya},
  year={2019}
}
We may release code for evaluating the models on various benchmarks.
We are still considering release of the larger models.