Recurrent Neural Networks (RNN) for Language Modeling in Python
Learn how to use RNNs to classify text sentiment, generate sentences, and translate text between languages.
4 Hours · 16 Videos · 54 Exercises · 11,609 Learners · 4500 XP
Learn How to Use RNN Modeling in Python

In this course, you will learn how to use Recurrent Neural Networks to classify text (binary and multiclass), generate phrases, and translate Portuguese sentences into English.
Machine learning models rely on numerical values to make predictions and classifications, but how can computers deal with text data? With the huge increase in available text data, applications such as automatic document classification, text generation, and neural machine translation become possible. Here, you'll learn how RNNs in machine learning can help with this process.
Discover the Power of Recurrent Neural Networks

You'll start this four-hour course by looking at the foundations of Recurrent Neural Networks. Exploring how information flows through a recurrent neural network, you'll use a Keras RNN model to perform sentiment classification.
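To make "information flows through a recurrent neural network" concrete, here is a minimal illustrative sketch (not course code; weights and function names are made up): the same weights are applied at every time step, and a hidden state carries context forward.

```python
import math

def rnn_forward(inputs, w_x=0.5, w_h=0.8, h=0.0):
    """Return the hidden state after processing the whole sequence."""
    for x in inputs:
        # the new state mixes the current input with the previous state (memory)
        h = math.tanh(w_x * x + w_h * h)
    return h

print(rnn_forward([1.0, 0.0, 0.0]))  # the first input still echoes in the final state
```

Even though the last two inputs are zero, the final hidden state is nonzero: information from the first time step persists through the recurrence, which is exactly what lets an RNN use earlier words when classifying a sentence.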
As you review RNN architecture in more detail, you'll learn about the vanishing and exploding gradient problems and how to use embedding layers in a language model.
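The gradient problems come from backpropagation through time repeatedly multiplying by the same recurrent weight. A tiny sketch (illustrative only, with a scalar weight standing in for the recurrent weight matrix) shows both failure modes:

```python
def backprop_through_time(weight, steps, grad=1.0):
    """Scale a gradient by the recurrent weight once per time step."""
    for _ in range(steps):
        grad *= weight
    return grad

vanishing = backprop_through_time(0.5, 50)  # ~8.9e-16: the gradient vanishes
exploding = backprop_through_time(1.5, 50)  # ~6.4e+08: the gradient explodes
print(vanishing, exploding)
```

With 50 time steps, a weight below 1 drives the gradient to effectively zero (no learning from distant context), while a weight above 1 blows it up (unstable training). GRU and LSTM cells, covered below, are designed to mitigate exactly this.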
Explore Language Models With Real-Life Data

Building on this knowledge, you'll discover how you can prepare data for a multi-class classification task, exploring how these tasks differ from binary classification.
Finally, you’ll learn how to use RNN models for text generation and neural machine translation. You’ll use your knowledge of recurrent neural networks to replicate the speech of Sheldon from The Big Bang Theory and to translate Portuguese phrases into English.
This course provides an in-depth look at RNNs in machine learning, giving you the knowledge to build your skills in this area.
Recurrent Neural Networks and Keras (Free)
In this chapter, you will learn the foundations of Recurrent Neural Networks (RNN): starting with some prerequisites, continuing to how information flows through the network, and finally seeing how to implement such models with Keras for the sentiment classification task.

- Keras models: Sequential (50 xp)
- Introduction to the course (50 xp)
- Keras models: Model (50 xp)
- Comparing the number of parameters of RNN and ANN (50 xp)
- Sentiment analysis (100 xp)
- Sequence to sequence models (50 xp)
- Introduction to language models (50 xp)
- Getting used to text data (100 xp)
- Preparing text data for model input (100 xp)
- Transforming new text (100 xp)
- Introduction to RNN inside Keras (50 xp)
- Keras models (100 xp)
- Keras preprocessing (100 xp)
- Your first RNN model (100 xp)
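The idea behind "preparing text data for model input" can be sketched in plain Python: build a vocabulary from a corpus and turn each sentence into a sequence of integer indices. (The function names below are illustrative; in the course itself this is done with Keras preprocessing utilities.)

```python
def build_vocab(sentences):
    """Assign each distinct word an integer index, starting at 1."""
    vocab = {}
    for sentence in sentences:
        for word in sentence.lower().split():
            if word not in vocab:
                vocab[word] = len(vocab) + 1  # index 0 is reserved for padding/unknown
    return vocab

def texts_to_sequences(sentences, vocab):
    """Map each sentence to its list of word indices (0 for unknown words)."""
    return [[vocab.get(w, 0) for w in s.lower().split()] for s in sentences]

corpus = ["I love this show", "I hate waiting"]
vocab = build_vocab(corpus)
print(texts_to_sequences(corpus, vocab))  # → [[1, 2, 3, 4], [1, 5, 6]]
```

These integer sequences (usually padded to a common length) are what an RNN layer actually consumes as input.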
You will learn about the vanishing and exploding gradient problems, which often occur in RNNs, and how to deal with them using GRU and LSTM cells. Furthermore, you'll create embedding layers for language models and revisit the sentiment classification task.

- Vanishing and exploding gradients (50 xp)
- Exploding gradient problem (100 xp)
- Vanishing gradient problem (100 xp)
- GRU and LSTM cells (50 xp)
- GRU cells are better than simpleRNN (100 xp)
- Stacking RNN layers (100 xp)
- The Embedding layer (50 xp)
- Number of parameters comparison (100 xp)
- Transfer learning (100 xp)
- Embeddings improve performance (100 xp)
- Sentiment classification revisited (50 xp)
- Better sentiment classification (100 xp)
- Using the CNN layer (100 xp)
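Conceptually, an embedding layer is just a trainable lookup table that maps each word index to a dense vector. A minimal sketch, assuming a random (untrained) table rather than Keras's `Embedding` layer:

```python
import random

random.seed(0)
vocab_size, embedding_dim = 6, 4

# one row of `embedding_dim` weights per word in the vocabulary;
# during training these rows would be learned like any other weights
table = [[random.uniform(-1, 1) for _ in range(embedding_dim)]
         for _ in range(vocab_size)]

def embed(sequence):
    """Replace each word index in the sequence with its vector."""
    return [table[i] for i in sequence]

vectors = embed([0, 3, 2])
print(len(vectors), len(vectors[0]))  # → 3 4  (3 words, each a 4-dim vector)
```

Pretrained tables (e.g. word2vec vectors) can be dropped in as the starting weights, which is what the transfer learning exercises build on.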
In this chapter, you will learn how to prepare data for the multi-class classification task, as well as the differences between multi-class classification and binary classification (sentiment analysis). Finally, you will learn how to create models and measure their performance with Keras.

- Data pre-processing (50 xp)
- Prepare label vectors (100 xp)
- Pre-process data (100 xp)
- Transfer learning for language models (50 xp)
- Transfer learning starting point (100 xp)
- Word2Vec (100 xp)
- Multi-class classification models (50 xp)
- Exploring the 20 News Groups dataset (100 xp)
- Classifying news articles (100 xp)
- Assessing the model's performance (50 xp)
- Precision-Recall trade-off (100 xp)
- Precision or Recall, that is the question (100 xp)
- Performance on multi-class classification (100 xp)
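The precision-recall trade-off exercises boil down to two counts-based formulas. A self-contained sketch (not course code; in practice you would use `sklearn.metrics`) computing both metrics for one class:

```python
def precision_recall(y_true, y_pred, positive=1):
    """Precision and recall for the `positive` class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0  # of predicted positives, how many were right
    recall = tp / (tp + fn) if tp + fn else 0.0     # of actual positives, how many were found
    return precision, recall

y_true = [1, 1, 0, 1, 0]
y_pred = [1, 0, 1, 1, 0]
print(precision_recall(y_true, y_pred))  # → (0.666..., 0.666...)
```

Raising the decision threshold typically trades recall for precision, and vice versa; which one to favor depends on the cost of false positives versus false negatives in your application.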
Sequence to Sequence Models
This chapter introduces you to two applications of RNN models: text generation and neural machine translation. You will learn how to prepare text data into the format the models need. The text generation model replicates a character's way of speech, and you'll have some fun mimicking Sheldon from The Big Bang Theory. Neural machine translation is used, for example, by Google Translate (with a much more complex model). In this chapter, you will create a model that translates short Portuguese phrases into English.

- Sequence to Sequence Models (50 xp)
- Text generation examples (100 xp)
- NMT example (100 xp)
- The Text Generating Function (50 xp)
- Predict next character (100 xp)
- Generate sentence with context (100 xp)
- Change the probability scale (100 xp)
- Text Generation Models (50 xp)
- Create vectors of sentences and next characters (100 xp)
- Preparing the data for training (100 xp)
- Creating the text generation model (100 xp)
- Neural Machine Translation (50 xp)
- Preparing the input text (100 xp)
- Preparing the output text (100 xp)
- Translate Portuguese to English (100 xp)
- Congratulations! (50 xp)
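The "change the probability scale" step in text generation rescales the model's predicted next-character probabilities with a temperature before sampling. A hedged sketch of the idea (function and variable names are illustrative, not the course's):

```python
import math

def rescale(probs, temperature=1.0):
    """Rescale a probability distribution with a softmax temperature."""
    logits = [math.log(p) / temperature for p in probs]
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [x / total for x in exps]

probs = [0.5, 0.3, 0.2]
print(rescale(probs, 0.5))  # low temperature sharpens the distribution
print(rescale(probs, 2.0))  # high temperature flattens it
```

A low temperature makes generation more conservative (the most likely character wins almost every time), while a high temperature makes it more varied but less coherent; temperature 1.0 leaves the model's distribution unchanged.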
Datasets

- Scripts of the TV show The Big Bang Theory
- Sample of small sentences in English and Portuguese
I am a Data Scientist focusing my work and research on applying machine learning to text data. I entered the field when I co-founded a startup in RegTech that automatically collects, classifies, and distributes regulations in highly regulated markets. I currently work at John Snow Labs as a Senior Data Scientist, where I do consulting projects and help maintain the company's NLP libraries specialized for the Finance, Legal, and Healthcare domains.
Join over 11 million learners and start Recurrent Neural Networks (RNN) for Language Modeling in Python today!