Transformer Models with PyTorch
Skill level: Advanced
Updated 01.2025 · PyTorch · Artificial Intelligence · 2 hours · 7 videos · 23 exercises · 1,900 XP · 6,446 learners · Statement of Accomplishment
Course Description
Deep-Dive into the Transformer Architecture
Transformer models have revolutionized text modeling, kickstarting the generative AI boom by enabling today's large language models (LLMs). In this course, you'll look at the key components of this architecture, including positional encoding, attention mechanisms, and feed-forward sublayers. You'll code these components in a modular way to build your own transformer step by step.
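To give a flavor of that modular approach, here is a minimal sketch of sinusoidal positional encoding in PyTorch, following the scheme from "Attention Is All You Need". The module name and the d_model/max_len parameters are illustrative assumptions, not the course's exact code.

```python
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    """Adds a fixed sinusoidal position signal to token embeddings."""
    def __init__(self, d_model: int, max_len: int = 5000):
        super().__init__()
        position = torch.arange(max_len).unsqueeze(1)          # (max_len, 1)
        div_term = torch.exp(
            torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model)
        )
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(position * div_term)           # even dimensions
        pe[:, 1::2] = torch.cos(position * div_term)           # odd dimensions
        self.register_buffer("pe", pe)                         # saved, but not trained

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); add the encoding for each position
        return x + self.pe[: x.size(1)]
```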
Implement Attention Mechanisms with PyTorch

The attention mechanism is a key development that helped formalize the transformer architecture. Self-attention allows transformers to better identify relationships between tokens, improving the quality of generated text. You'll learn how to create a multi-head attention class that forms a key building block in your transformer models.
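As a concrete preview, below is a minimal sketch of such a class using the standard split-heads formulation; names like q_proj and the optional mask argument are illustrative choices, not the course's exact API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadAttention(nn.Module):
    def __init__(self, d_model: int, num_heads: int):
        super().__init__()
        assert d_model % num_heads == 0, "d_model must divide evenly across heads"
        self.num_heads = num_heads
        self.head_dim = d_model // num_heads
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, query, key, value, mask=None):
        batch, _, d_model = query.shape

        # Project, then split into heads: (batch, num_heads, seq_len, head_dim)
        def split(x, proj):
            return proj(x).view(batch, -1, self.num_heads, self.head_dim).transpose(1, 2)

        q, k, v = split(query, self.q_proj), split(key, self.k_proj), split(value, self.v_proj)

        # Scaled dot-product attention, computed per head
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        out = F.softmax(scores, dim=-1) @ v

        # Merge heads back together and apply the output projection
        out = out.transpose(1, 2).contiguous().view(batch, -1, d_model)
        return self.out_proj(out)
```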
Build Your Own Transformer Models

Learn to build encoder-only, decoder-only, and encoder-decoder transformer models, and how to choose and code these architectures for different language tasks, including text classification and sentiment analysis, text generation and completion, and sequence-to-sequence translation.
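For instance, an encoder-only classifier can be sketched with PyTorch's built-in nn.TransformerEncoder; the hyperparameters and the mean-pooling head below are illustrative assumptions rather than the course's exact design (positional encoding is omitted for brevity).

```python
import torch
import torch.nn as nn

class EncoderClassifier(nn.Module):
    """Encoder-only transformer for tasks like sentiment analysis."""
    def __init__(self, vocab_size: int, d_model: int = 128, num_heads: int = 4,
                 num_layers: int = 2, num_classes: int = 2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        hidden = self.encoder(self.embed(token_ids))   # (batch, seq_len, d_model)
        return self.classifier(hidden.mean(dim=1))     # pool over positions, classify
```

A decoder-only variant for text generation would instead apply a causal mask so each position can only attend to earlier tokens, while an encoder-decoder model adds cross-attention from the decoder to the encoder's outputs.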
Prerequisites

Deep Learning for Text with PyTorch

1. The Building Blocks of Transformer Models
Discover what makes the hottest deep learning architecture in AI tick! Learn about the components that make up transformer models, including the self-attention mechanism described in the renowned paper "Attention Is All You Need."
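As a quick taste, the paper's core formula, softmax(QKᵀ / √d_k)·V, ships as a built-in in PyTorch 2.x; the toy tensor sizes below are arbitrary.

```python
import torch
import torch.nn.functional as F

# Toy tensors: batch of 1, sequence of 4 tokens, embedding size 8
q = k = v = torch.randn(1, 4, 8)

# Computes softmax(Q K^T / sqrt(d_k)) V in one call
out = F.scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 4, 8])
```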
2. Building Transformer Architectures
Design transformer encoder and decoder blocks, and combine them with positional encoding, multi-head attention, and position-wise feed-forward networks to build your very own transformer architectures. Along the way, you'll develop a deep understanding of, and appreciation for, how transformers work under the hood.
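As a preview of how those pieces fit together, here is a minimal sketch of one encoder block, assuming the post-norm layout of the original paper; the dropout rate and feed-forward width are illustrative.

```python
import torch
import torch.nn as nn

class EncoderBlock(nn.Module):
    """One encoder block: self-attention plus a position-wise feed-forward
    network, each wrapped in a residual connection and layer normalization."""
    def __init__(self, d_model: int, num_heads: int, d_ff: int, dropout: float = 0.1):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model)
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn_out, _ = self.attn(x, x, x)              # self-attention: q = k = v = x
        x = self.norm1(x + self.dropout(attn_out))    # residual + layer norm
        x = self.norm2(x + self.dropout(self.ff(x)))  # position-wise feed-forward
        return x
```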
Earn a Statement of Accomplishment

Add these credentials to your profile, CV, or LinkedIn, and share them on social media and in your performance review.

Included with Premium or Teams

Enroll now and join 19 million learners starting Transformer Models with PyTorch today!