Packing a bilingual dictionary for your European holiday, or keeping one on your desk to finish your foreign language homework, is a thing of the past. You can simply hop on the internet and use a language translation service to quickly understand what a street sign means, or to find out how to greet and thank a foreigner in their language. Behind these services are complex machine translation models. Have you ever wondered how they work? This course lets you explore the inner workings of a machine translation model. You will use Keras, a powerful Python-based deep learning library, to implement a translation model, train it to perform English-to-French translation, and learn techniques to improve it. By the end of this course, you will have developed an in-depth understanding of machine translation models and appreciate them even more!
Introduction to machine translation
In this chapter, you'll understand what the encoder-decoder architecture is and how it is used for machine translation. You will also learn about Gated Recurrent Units (GRUs) and how they are used in the encoder-decoder architecture.

- Introduction to machine translation (50 xp)
- Understanding one-hot vectors (100 xp)
- Part 1: Exploring the to_categorical() function (100 xp)
- Part 2: Exploring the to_categorical() function (100 xp)
- Encoder decoder architecture (50 xp)
- Part 1: Text reversing model - Encoder (100 xp)
- Part 2: Text reversing model - Encoder (100 xp)
- Complete text reversing model (100 xp)
- Understanding sequential models (50 xp)
- Part 1: Understanding GRU models (100 xp)
- Part 2: Understanding GRU models (100 xp)
- Understanding sequential model output (100 xp)
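The one-hot vectors covered in this chapter can be sketched in a few lines of plain Python. This is an illustrative stand-in for Keras's to_categorical() (the helper name `one_hot_sketch` is made up for this example, not part of any library): each integer word ID becomes a vector of zeros with a single 1 at that ID's position.

```python
def one_hot_sketch(indices, num_classes):
    """Map each integer class index to a one-hot vector,
    mimicking the idea behind keras.utils.to_categorical."""
    return [[1.0 if position == index else 0.0
             for position in range(num_classes)]
            for index in indices]

# Word IDs 0 and 2 in a 3-word vocabulary:
vectors = one_hot_sketch([0, 2], num_classes=3)
print(vectors)  # [[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]
```

Representing words this way lets a model treat every word in the vocabulary as an equally distinct category, which is the input format the text-reversing model in this chapter relies on.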
Implementing an encoder decoder model with Keras
In this chapter, you will implement the encoder-decoder model with the Keras functional API. While doing so, you will learn about several useful Keras layers, such as the RepeatVector and TimeDistributed layers.

- Implementing the encoder (50 xp)
- Part 1: Exploring the dataset (100 xp)
- Part 2: Exploring the dataset (100 xp)
- Defining the encoder (100 xp)
- Implementing the decoder (50 xp)
- Understanding the RepeatVector layer (100 xp)
- The shape of a RepeatVector layer output (50 xp)
- Defining the decoder (100 xp)
- Dense and TimeDistributed layers (50 xp)
- Part 1: Enter to win amazing prizes (100 xp)
- Part 2: Let's play a few more games (100 xp)
- Implementing the full encoder decoder model (50 xp)
- Part 1: Defining the full model (100 xp)
- Part 2: Defining the full model (100 xp)
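The two layers this chapter highlights are easiest to understand through the shape transformations they perform. Below is a plain-Python sketch (not actual Keras code): RepeatVector(n) copies the encoder's context vector once per decoder timestep, turning shape (d,) into (n, d), while TimeDistributed applies the same inner layer independently to every timestep of a sequence. The toy doubling "layer" is purely illustrative.

```python
def repeat_vector(vector, n):
    """Sketch of keras.layers.RepeatVector: (d,) -> (n, d),
    feeding the same context to every decoder timestep."""
    return [list(vector) for _ in range(n)]

def time_distributed(layer_fn, sequence):
    """Sketch of keras.layers.TimeDistributed: apply the same
    layer to each timestep of the sequence independently."""
    return [layer_fn(timestep) for timestep in sequence]

context = [0.1, 0.2]                    # encoder output, d = 2
decoder_in = repeat_vector(context, 3)  # 3 decoder timesteps
# The same (toy) "layer" runs at every timestep:
out = time_distributed(lambda v: [2 * x for x in v], decoder_in)
```

In the real model, the inner layer would be a Dense layer producing a word-probability vector per timestep, with its weights shared across all timesteps.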
Training and generating translations
In this chapter, you will train the previously defined model and then use the trained model to generate translations. You will see that the model does a good job of translating sentences.

- Part 1: Preprocessing the Data (50 xp)
- Tokenizing sentences with Keras (100 xp)
- Controlling the vocabulary with the Tokenizer (100 xp)
- Part 2: Preprocessing the Data (50 xp)
- Adding special tokens (100 xp)
- Padding sentences (100 xp)
- Reversing sentences (100 xp)
- Training the NMT model (50 xp)
- Training the model (100 xp)
- Splitting data to training and validation sets (100 xp)
- Training the model with validation (100 xp)
- Generating translations with the NMT (50 xp)
- Part 1: Treasure hunt (100 xp)
- Part 2: Treasure hunt (100 xp)
- Generating English-French translations (100 xp)
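The preprocessing steps in this chapter (tokenizing, controlling the vocabulary, and padding) can be sketched in plain Python. This is a simplified approximation of what Keras's Tokenizer and pad_sequences do, not their actual implementations; as in Keras, ID 0 is reserved for padding and more frequent words get smaller IDs. The helper names here are made up for illustration.

```python
from collections import Counter

def fit_word_index(sentences, num_words=None):
    """Build a word -> ID map ordered by frequency (ID 0 = padding),
    approximating keras.preprocessing.text.Tokenizer."""
    counts = Counter(word for s in sentences for word in s.lower().split())
    return {word: i + 1
            for i, (word, _) in enumerate(counts.most_common(num_words))}

def to_padded_ids(sentences, word_index, maxlen):
    """Convert sentences to ID sequences, then truncate or pad
    each to maxlen, approximating pad_sequences."""
    seqs = [[word_index[w] for w in s.lower().split() if w in word_index]
            for s in sentences]
    return [seq[:maxlen] + [0] * max(0, maxlen - len(seq)) for seq in seqs]

data = ["the cat sat", "the dog sat on the mat"]
index = fit_word_index(data)
print(to_padded_ids(data, index, maxlen=4))
```

Passing num_words caps the vocabulary at the most frequent words, which is how the "Controlling the vocabulary" exercise keeps the model's output layer a manageable size.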
Teacher Forcing and word embeddings
In this chapter, you will learn about a technique known as Teacher Forcing, which enables translation models to be trained better and faster. Then you will learn how you can use word embeddings to make the model even better.

- Introduction to Teacher Forcing (50 xp)
- Defining the Teacher Forcing model layers (100 xp)
- Defining the Teacher Forcing model (100 xp)
- Preprocessing data (100 xp)
- Training the model with Teacher Forcing (50 xp)
- Training the model (100 xp)
- Splitting training and validation data (100 xp)
- Training the model with validation (100 xp)
- Generating translations from the model (50 xp)
- Defining the decoder of the inference model (100 xp)
- Link between the trained and inference model (100 xp)
- Generating translations (100 xp)
- Using word embedding for machine translation (50 xp)
- Measuring word vector similarity (100 xp)
- Defining the embedding model (100 xp)
- Training the word embedding based model (100 xp)
- Wrap-up and the final showdown (50 xp)
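At its core, Teacher Forcing is a matter of how the decoder's training data is arranged: instead of feeding the decoder its own (possibly wrong) previous prediction, it is fed the ground-truth previous word, shifted one step. A minimal sketch of that data preparation, with made-up `<sos>`/`<eos>` token IDs chosen for this example:

```python
def teacher_forcing_pair(target_ids, sos_id=1, eos_id=2):
    """Build (decoder_input, decoder_target) for Teacher Forcing:
    the decoder input is the target shifted right (prefixed with
    <sos>), so at step t the decoder sees the true word t-1 and
    must predict word t, with <eos> closing the target."""
    decoder_input = [sos_id] + list(target_ids)
    decoder_target = list(target_ids) + [eos_id]
    return decoder_input, decoder_target

# A French sentence encoded as word IDs 5, 9, 3:
inp, tgt = teacher_forcing_pair([5, 9, 3])
print(inp)  # [1, 5, 9, 3]
print(tgt)  # [5, 9, 3, 2]
```

Because the decoder always conditions on correct history during training, errors cannot compound across timesteps, which is why models train better and faster; at inference time, a separate decoder model feeds each predicted word back in as the next input.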
In the following tracks
Deep Learning for NLP
Prerequisites
Introduction to Deep Learning in Python
Data Scientist and Author
Thushan Ganegedara is a Senior Data Scientist. He is the author of TF2 in Action (Manning) and NLP with TensorFlow (v1.6), and has over 4 years of experience with TensorFlow. Thushan likes to wear many hats: YouTuber, blogger, presenter, and StackOverflow contributor. Deep learning and machine learning stand out as his passions. When he's not delving into the latest ML research, you can find him meditating or swimming (not at the same time). Follow him on LinkedIn.
What do other learners have to say?
I've used other sites—Coursera, Udacity, things like that—but DataCamp's been the one that I've stuck with.
Devon Edwards Joseph
Lloyds Banking Group
DataCamp is the top resource I recommend for learning data science.
Harvard Business School
DataCamp is by far my favorite website to learn from.
Decision Science Analytics, USAA