LLM Book 4 - GPT, GPT-2, and GPT-3 / Generative Pre-trained Transformers for Text Generation for Entry-level

On Sale: $5.19 (pay what you want; minimum $5.19)

Table of Contents

1. Introduction

1.1 What is Text Generation?

1.2 What are Generative Pre-trained Transformers?

1.3 Why are they important?

1.4 What are the challenges and limitations?

1.5 How to use this book?

2. GPT: The First Generative Pre-trained Transformer

2.1 The Architecture of GPT

2.2 The Training Process of GPT

2.3 The Evaluation of GPT

2.4 The Applications of GPT

2.5 The Code Examples of GPT

3. GPT-2: The Improved Generative Pre-trained Transformer

3.1 The Improvements of GPT-2 over GPT

3.2 The Training Process of GPT-2

3.3 The Evaluation of GPT-2

3.4 The Applications of GPT-2

3.5 The Code Examples of GPT-2

4. GPT-3: The State-of-the-Art Generative Pre-trained Transformer

4.1 The Improvements of GPT-3 over GPT-2

4.2 The Training Process of GPT-3

4.3 The Evaluation of GPT-3

4.4 The Applications of GPT-3

4.5 The Code Examples of GPT-3

5. Conclusion

5.1 The Summary of the Book

5.2 The Future Directions of Generative Pre-trained Transformers

5.3 The Resources and References

You will get a PDF (407 KB) file