Code and samples from the paper "Language Models are Unsupervised Multitask Learners".
For now, we have only released a smaller (117M parameter) version of GPT-2.
See more details in our blog post.
Download the model data:

```sh
sh download_model.sh 117M
```
Install Python packages:

```sh
pip3 install -r requirements.txt
```
WARNING: Samples are unfiltered and may contain offensive content.
To generate unconditional samples from the small model:

```sh
python3 src/generate_unconditional_samples.py | tee samples
```
There are various flags for controlling the samples:

```sh
python3 src/generate_unconditional_samples.py --top_k 40 --temperature 0.7 | tee samples
```
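To give a rough intuition for what these flags do (this is a minimal sketch, not the repository's actual sampling code; the function name and the use of NumPy are illustrative assumptions): `--temperature` rescales the logits before the softmax, and `--top_k` truncates the distribution to the k highest-scoring tokens before sampling.

```python
import numpy as np

def sample_logits(logits, temperature=1.0, top_k=0, rng=None):
    """Sample a token id from raw logits.

    temperature scales logits before softmax (lower = sharper);
    top_k > 0 keeps only the k highest-scoring tokens (0 = no truncation).
    """
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature
    if top_k > 0:
        # Mask every logit below the k-th largest to -inf so it gets
        # zero probability after the softmax.
        kth = np.sort(logits)[-top_k]
        logits = np.where(logits < kth, -np.inf, logits)
    # Numerically stable softmax.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))
```

Lower temperatures and smaller k both concentrate probability mass on the most likely tokens, trading diversity for coherence; `temperature 1` with `top_k 0` samples from the model's unmodified distribution.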
To give the model custom prompts, you can use:

```sh
python3 src/interactive_conditional_samples.py --top_k 40
```
While we have not yet released GPT-2 itself, you can see some samples from it in the `gpt-2-samples` folder.
We show unconditional samples with default settings (temperature 1 and no truncation), with temperature 0.7, and with truncation with top_k 40.
We may release code for evaluating the models on various benchmarks.
We are still considering release of the larger models.
Coming soon!
This is the DAGsHub mirror of GPT-2 made by OpenAI.