
Build an LLM From Scratch

Key Takeaways:
  • Learn how to create an LLM from scratch in Python and PyTorch.
  • Understand how to design model architecture, build transformer blocks, and train effectively.
  • Gain insight into model evaluation, tuning, and post-training refinement.
Friday, November 21, 11 AM ET

Register for the webinar


Description

Large language models may feel like magic—but under the hood, they’re just sophisticated neural architectures. In this advanced, hands-on session, you’ll learn how to design, train, and evaluate your own LLM inspired by Andrej Karpathy’s nanoGPT, using Python and PyTorch. This is a rare opportunity to go beyond using pre-trained models and truly understand how they work.

In this code-along, Jacob Buckman, CEO of Manifest AI, will walk you through the process, from constructing transformer blocks to setting up the training pipeline and evaluating model performance. You’ll gain a deeper appreciation of the architecture and training dynamics that power modern language models, and come away ready to experiment with your own custom builds.
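
To give a flavor of what the code-along covers, here is a minimal sketch of a nanoGPT-style transformer block and a single next-token training step in PyTorch. The class names, hyperparameters, and random toy data are illustrative assumptions, not the code presented in the session; it assumes PyTorch 2.x for the built-in scaled dot-product attention.

    # Illustrative sketch only: a small GPT-style block and one training step.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CausalSelfAttention(nn.Module):
        def __init__(self, d_model, n_heads):
            super().__init__()
            self.n_heads = n_heads
            self.qkv = nn.Linear(d_model, 3 * d_model)   # joint Q/K/V projection
            self.proj = nn.Linear(d_model, d_model)      # output projection

        def forward(self, x):
            B, T, C = x.shape
            q, k, v = self.qkv(x).chunk(3, dim=-1)
            # split into heads: (B, n_heads, T, head_dim)
            q = q.view(B, T, self.n_heads, C // self.n_heads).transpose(1, 2)
            k = k.view(B, T, self.n_heads, C // self.n_heads).transpose(1, 2)
            v = v.view(B, T, self.n_heads, C // self.n_heads).transpose(1, 2)
            # causal (masked) attention; requires PyTorch 2.x
            y = F.scaled_dot_product_attention(q, k, v, is_causal=True)
            return self.proj(y.transpose(1, 2).reshape(B, T, C))

    class TransformerBlock(nn.Module):
        def __init__(self, d_model, n_heads):
            super().__init__()
            self.ln1 = nn.LayerNorm(d_model)
            self.ln2 = nn.LayerNorm(d_model)
            self.attn = CausalSelfAttention(d_model, n_heads)
            self.mlp = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                                     nn.Linear(4 * d_model, d_model))

        def forward(self, x):
            x = x + self.attn(self.ln1(x))   # pre-norm residual attention
            x = x + self.mlp(self.ln2(x))    # pre-norm residual feed-forward
            return x

    class MiniGPT(nn.Module):
        def __init__(self, vocab_size, d_model=128, n_heads=4, n_layers=2, max_len=256):
            super().__init__()
            self.tok = nn.Embedding(vocab_size, d_model)
            self.pos = nn.Embedding(max_len, d_model)
            self.blocks = nn.Sequential(*[TransformerBlock(d_model, n_heads)
                                          for _ in range(n_layers)])
            self.ln_f = nn.LayerNorm(d_model)
            self.head = nn.Linear(d_model, vocab_size, bias=False)

        def forward(self, idx):
            B, T = idx.shape
            x = self.tok(idx) + self.pos(torch.arange(T, device=idx.device))
            return self.head(self.ln_f(self.blocks(x)))

    # One training step on random token data (next-token prediction).
    model = MiniGPT(vocab_size=1000)
    opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
    batch = torch.randint(0, 1000, (8, 65))          # (batch, sequence + 1)
    inputs, targets = batch[:, :-1], batch[:, 1:]    # shift by one token
    logits = model(inputs)
    loss = F.cross_entropy(logits.reshape(-1, 1000), targets.reshape(-1))
    loss.backward()
    opt.step()

A real build adds tokenization, data loading, learning-rate scheduling, and evaluation on held-out text, which is the ground the session covers in depth.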

Presenter Bio

Jacob Buckman, CEO at Manifest AI

Jacob runs the AI research company Manifest AI. He is a co-creator of the power attention mechanism for long-context LLMs and an expert in deep learning and reinforcement learning. Previously, he was a resident at Google Brain.
