
Course

Introduction to LLMs in Python

Intermediate Skill Level
4.7 (1,566 reviews)
Updated 01/2026
Learn the nuts and bolts of LLMs and the revolutionary transformer architecture they are based on!
Start Course for Free
Python · Artificial Intelligence
3 hr
11 videos
34 Exercises
2,700 XP
32,199 learners
Statement of Accomplishment

Course Description

Uncover What's Behind the Large Language Model Hype



Large Language Models (LLMs) have become pivotal tools driving some of the most striking advancements in today's AI landscape. This hands-on course equips you with the practical knowledge and skills needed to understand, build, and harness LLMs for complex language tasks such as translation and text generation.

Discover LLM Architecture and Leverage Pre-Trained Models



Through interactive coding exercises, you'll discover the different transformer architectures and how to tell them apart. You'll leverage pre-trained language models and datasets from Hugging Face to fine-tune a model and evaluate it with metrics suited to LLMs. Finally, you'll examine the ethical and bias concerns relevant to LLMs and ways to identify them. By the end of this course, you will be able to build, fine-tune, and evaluate LLMs with specialized metrics, while understanding the key challenges and ethical considerations of deploying real-world LLM applications.
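To give a taste of what "different transformer architectures" means in practice: the most visible difference between encoder-style models (like BERT) and decoder-style models (like GPT) is the attention mask. Encoders let every token attend to every other token, while decoders apply a causal mask so each position only sees itself and earlier positions. A minimal plain-Python sketch (the function names here are illustrative, not from any library):

```python
def encoder_mask(n):
    """Bidirectional (encoder-style) mask: every token attends to all n tokens."""
    return [[1] * n for _ in range(n)]

def decoder_mask(n):
    """Causal (decoder-style) mask: token i attends only to positions j <= i."""
    return [[1 if j <= i else 0 for j in range(n)] for i in range(n)]

if __name__ == "__main__":
    # Lower-triangular: position 0 sees only itself; position 3 sees all four.
    for row in decoder_mask(4):
        print(row)
```

Encoder-decoder models (like T5) combine both: a bidirectional mask over the input and a causal mask over the output being generated.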

What you'll learn

  • Define the end-to-end workflow for fine-tuning a pre-trained LLM with Hugging Face
  • Differentiate practical techniques to mitigate bias, hallucination, and other ethical risks when deploying LLMs
  • Distinguish between encoder-only, decoder-only, and encoder-decoder transformer architectures
  • Evaluate model performance by selecting and interpreting appropriate metrics such as accuracy, BLEU, ROUGE, perplexity, and toxicity
  • Identify the primary language tasks large language models can perform and the Hugging Face tools used to run them
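As a small illustration of one metric from the list above: perplexity is the exponential of the average negative log-likelihood a model assigns to each token, so a model that predicts every token perfectly scores 1, and a more "surprised" model scores higher. A self-contained sketch using hypothetical per-token probabilities (no model required):

```python
import math

def perplexity(token_probs):
    """Perplexity = exp(mean negative log-likelihood over the tokens)."""
    nll = [-math.log(p) for p in token_probs]
    return math.exp(sum(nll) / len(nll))

# Hypothetical probabilities a model might assign to each token of a sentence.
confident = [0.9, 0.8, 0.95, 0.85]
uncertain = [0.2, 0.1, 0.3, 0.25]

print(perplexity(confident))  # close to 1: the model predicted these tokens well
print(perplexity(uncertain))  # much higher: the model was often surprised
```

BLEU and ROUGE, by contrast, compare generated text against reference text via n-gram overlap, which is why the course pairs them with translation and summarization tasks.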

Sound like what you want to learn?

Start Course for Free

Prerequisites

Working with Hugging Face
Chapter 1: Getting Started with Large Language Models (LLMs)

Begin your journey with Large Language Models (LLMs): learn what they are and what they can do, and peek under the hood to see how they work.

Chapter 2: Fine-tuning LLMs

Learn how to leverage pre-trained LLMs and datasets from Hugging Face to fine-tune a model.

Earn Statement of Accomplishment

Add this credential to your LinkedIn profile, resume, or CV
Share it on social media and in your performance review
Enroll Now

Don’t just take our word for it

4.7 from 1,566 reviews
5 stars: 80% · 4 stars: 18% · 3 stars: 2% · 2 stars: 0% · 1 star: 0%

FAQs

Do I need prior knowledge of LLMs or transformers?

No, you don't need prior knowledge of LLMs or transformers. You should, however, be familiar with navigating the Hugging Face Hub and with deep learning models. The course discusses transformer architecture at a high level to help you understand the different structures available.

What tools will I use during the course?

You will use Hugging Face’s pre-trained models and datasets, along with Python, to work on practical coding exercises. You won't be using any APIs.

Join over 19 million learners and start Introduction to LLMs in Python today!

