# Build a Large Language Model (From Scratch)

(If you downloaded the code bundle from the Manning website, please consider visiting the official code repository on GitHub at https://github.com/rasbt/LLMs-from-scratch.)



In Build a Large Language Model (from Scratch), you'll discover how LLMs work from the inside out. In this book, I'll guide you step by step through creating your own LLM, explaining each stage with clear text, diagrams, and examples.

The method described in this book for training and developing your own small-but-functional model for educational purposes mirrors the approach used in creating large-scale foundational models such as those behind ChatGPT.



## Table of Contents

Please note that this README.md file is a Markdown (.md) file. If you have downloaded this code bundle from the Manning website and are viewing it on your local computer, I recommend using a Markdown editor or previewer for proper viewing. If you haven't installed a Markdown editor yet, MarkText is a good free option.

Alternatively, you can view this and other files on GitHub at https://github.com/rasbt/LLMs-from-scratch.



| Chapter Title | Main Code (for quick access) | All Code + Supplementary |
|---------------|------------------------------|--------------------------|
| Ch 1: Understanding Large Language Models | No code | No code |
| Ch 2: Working with Text Data | - ch02.ipynb<br>- dataloader.ipynb (summary)<br>- exercise-solutions.ipynb | ./ch02 |
| Ch 3: Coding Attention Mechanisms | - ch03.ipynb<br>- multihead-attention.ipynb (summary) | ./ch03 |
| Ch 4: Implementing a GPT Model from Scratch | coming soon | ... |
| Ch 5: Pretraining on Unlabeled Data | Q1 2024 | ... |
| Ch 6: Finetuning for Text Classification | Q2 2024 | ... |
| Ch 7: Finetuning with Human Feedback | Q2 2024 | ... |
| Ch 8: Using Large Language Models in Practice | Q2/3 2024 | ... |
| Appendix A: Introduction to PyTorch* | - code-part1.ipynb<br>- code-part2.ipynb<br>- DDP-script.py<br>- exercise-solutions.ipynb | ./appendix-A |

(* Please see this and this folder if you need more guidance on installing Python and Python packages.)
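Chapter 2's dataloader notebook centers on turning tokenized text into input-target training pairs for next-token prediction. As a plain-Python sketch of that sliding-window idea (the function name and example values here are illustrative, not the book's exact code):

```python
def create_input_target_pairs(token_ids, context_length, stride):
    """Slide a window of size `context_length` over a token ID sequence,
    producing (input, target) pairs where each target is the input
    shifted one position to the right -- the next-token objective
    a GPT-style model is trained on. (Illustrative sketch, not the
    book's actual dataloader code.)"""
    inputs, targets = [], []
    for i in range(0, len(token_ids) - context_length, stride):
        inputs.append(token_ids[i : i + context_length])
        targets.append(token_ids[i + 1 : i + context_length + 1])
    return inputs, targets

token_ids = list(range(10))  # stand-in for real tokenizer output
x, y = create_input_target_pairs(token_ids, context_length=4, stride=4)
# x[0] == [0, 1, 2, 3] and y[0] == [1, 2, 3, 4]
```

The book's notebooks wrap this logic in a PyTorch `Dataset`/`DataLoader` so batches can be shuffled and collated into tensors, but the windowing principle is the same.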



(A mental model summarizing the contents covered in this book.)
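Chapter 3 revolves around causal (masked) self-attention, where each token may attend only to itself and earlier tokens. Below is a minimal plain-Python sketch of the masking-and-softmax step (illustrative only; the chapter's notebooks implement this with PyTorch tensors and learned query/key/value projections):

```python
import math

def causal_attention_weights(scores):
    """Apply a causal mask to a square matrix of raw attention scores
    (conceptually, queries @ keys^T), so position i attends only to
    positions <= i, then normalize each row with a softmax.
    (Illustrative sketch, not the book's actual attention code.)"""
    weights = []
    for i, row in enumerate(scores):
        visible = row[: i + 1]                       # mask out future positions
        m = max(visible)                             # subtract max for numerical stability
        exps = [math.exp(v - m) for v in visible]
        total = sum(exps)
        weights.append([e / total for e in exps] + [0.0] * (len(row) - i - 1))
    return weights

# With uniform scores, each row is uniform over the visible positions:
w = causal_attention_weights([[0.0] * 3 for _ in range(3)])
# w[0] == [1.0, 0.0, 0.0]; w[1] == [0.5, 0.5, 0.0]
```

The resulting weight matrix is lower-triangular with rows summing to 1; multi-head attention, covered in the same chapter, runs several such attention computations in parallel and concatenates their outputs.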
