Code and samples from the paper "Language Models are Unsupervised Multitask Learners".
For now, we have only released a smaller (117M parameter) version of GPT-2.
See more details in our blog post.
This repository is meant to be a starting point for researchers and engineers to experiment with GPT-2-117M. While GPT-2-117M is less proficient than GPT-2-1.5B, it is useful for a wide range of research and applications which could also apply to larger models.
Please let us know if you're doing interesting research with or working on applications of GPT-2-117M! We're especially interested in hearing from and potentially working with those who are studying:
- Potential malicious use cases and defenses against them (e.g. the detectability of synthetic text)
- The extent of problematic content (e.g. bias) in GPT-2-117M outputs
For development details, see DEVELOPERS.md.
For the list of contributors, see CONTRIBUTORS.md.
WARNING: Samples are unfiltered and may contain offensive content.
While we have not yet released GPT-2 itself, you can see some samples from it in the gpt-2-samples folder. We show unconditional samples with default settings (temperature 1 and no truncation), with temperature 0.7, and with truncation with top_k 40. We show conditional samples, with contexts drawn from WebText's test set, with default settings (temperature 1 and no truncation), with temperature 0.7, and with truncation with top_k 40.
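For readers unfamiliar with these settings, here is a minimal sketch of what temperature scaling and top_k truncation do during sampling, written in NumPy. This is a conceptual illustration, not the repository's sampling code; the function name, signature, and the `logits` placeholder (a model's next-token scores) are ours.

```python
import numpy as np

def sample_token(logits, temperature=1.0, top_k=0, rng=None):
    """Sample one token id from a vector of next-token logits.

    temperature=1.0 leaves the distribution unchanged; lower values
    (e.g. 0.7) sharpen it toward the most likely tokens. top_k=0 means
    no truncation; top_k=40 keeps only the 40 highest-scoring tokens.
    """
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature
    if top_k > 0:
        # Mask out (set to -inf) everything below the k-th largest logit.
        kth_largest = np.sort(logits)[-top_k]
        logits = np.where(logits < kth_largest, -np.inf, logits)
    # Softmax over the (possibly truncated) logits.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

# The three settings used for the released samples would correspond to:
#   sample_token(logits)                   # temperature 1, no truncation
#   sample_token(logits, temperature=0.7)  # temperature 0.7
#   sample_token(logits, top_k=40)         # truncation with top_k 40
```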
Please use the following bibtex entry:
```
@article{radford2019language,
  title={Language Models are Unsupervised Multitask Learners},
  author={Radford, Alec and Wu, Jeff and Child, Rewon and Luan, David and Amodei, Dario and Sutskever, Ilya},
  year={2019}
}
```
We may release code for evaluating the models on various benchmarks.
We are still considering release of the larger models.