Have you ever wondered how Gmail autocompletes your sentences, or what powers WhatsApp's suggestions when you're typing a message? The technology behind these helpful writing hints is machine learning. In this course, you'll build and train machine learning models for different natural language generation tasks. For example, you'll train a model on the literary works of Shakespeare and generate text in the style of his writing. You'll also learn how to create a neural translation model to translate English sentences into French. Finally, you'll train a seq2seq model to generate your own natural language autocomplete sentences, just like Gmail!
Introduction to Sequential Data
The order of words in sentences is important (unless Yoda you are called). That's why in this chapter, you'll learn how to represent your text data sequentially and use a neural network architecture to model it. You'll learn how to create and train a recurrent neural network to generate new text, character by character. You'll also use the names dataset to build your own baby name generator, using a very simple recurrent neural network and the Keras package.

- Handling sequential data (50 xp)
- Preprocess names dataset (100 xp)
- Preprocessing names dataset (cont'd) (100 xp)
- Introduction to recurrent neural networks (50 xp)
- Create input and target tensors (100 xp)
- Initialize input and target vectors with values (100 xp)
- Build and compile RNN network (100 xp)
- Inference using recurrent neural networks (50 xp)
- Train RNN model and start predictions (100 xp)
- Generate baby names (100 xp)
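To make the character-by-character setup concrete, here is a minimal sketch of the kind of preprocessing the exercises cover: building a character vocabulary and input/target pairs where each target is simply the next character. The tiny `names` list is a made-up stand-in for the course's names dataset.

```python
# Hypothetical stand-in for the course's names dataset.
names = ["anna", "amy", "ben"]

# Build the vocabulary: every distinct character, plus '\n' as an
# end-of-name marker the model can learn to emit.
vocab = sorted(set("".join(names) + "\n"))
char_to_idx = {ch: i for i, ch in enumerate(vocab)}
idx_to_char = {i: ch for ch, i in char_to_idx.items()}

# Input/target pairs: the target is the input shifted by one character,
# so the network learns "given this character, predict the next one".
pairs = []
for name in names:
    seq = name + "\n"
    for i in range(len(seq) - 1):
        pairs.append((char_to_idx[seq[i]], char_to_idx[seq[i + 1]]))

print(len(vocab))   # vocabulary size
print(pairs[:3])    # first few (input, target) index pairs
```

In the course you'll build the same idea out into full one-hot input and target tensors that a Keras RNN can consume.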
Write Like Shakespeare
In this chapter, you'll find out how to overcome the limitations of recurrent neural networks when input sequences span long intervals. To avoid the vanishing and exploding gradient problems, you'll be introduced to long short-term memory (LSTM) networks, which are more effective when working with long-term dependencies. You'll work on a fun project where you'll build and train a simple LSTM model on selected literary works of Shakespeare to generate new text in his unique writing style.

- Limitations of recurrent neural networks (50 xp)
- Vanishing and exploding gradients (100 xp)
- Simple network using Keras (100 xp)
- Vanishing gradients (100 xp)
- Introduction to long short-term memory (50 xp)
- Vocabulary and character to integer mapping (100 xp)
- Input and target dataset (100 xp)
- Create and initialize the input and target vectors (100 xp)
- Create LSTM model in Keras (100 xp)
- Inference using long short-term memory (50 xp)
- Train LSTM model (100 xp)
- Predict next character given a sequence (100 xp)
- Generate text imitating Shakespeare (100 xp)
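The vanishing/exploding gradient problem can be previewed with a toy calculation (this is an illustration, not course code): backpropagation through time multiplies one derivative factor per timestep, so the gradient scales roughly like `factor ** num_steps`. When the repeated factor is below 1 the gradient shrinks toward zero; above 1 it blows up.

```python
# Toy model of backpropagation through time: one multiplicative
# derivative factor per timestep.
def gradient_after(num_steps, factor):
    grad = 1.0
    for _ in range(num_steps):
        grad *= factor
    return grad

print(gradient_after(50, 0.9))  # shrinks toward zero -> vanishing gradient
print(gradient_after(50, 1.1))  # grows rapidly -> exploding gradient
```

LSTM cells mitigate this by using gated additive state updates instead of repeated multiplication through the same weights, which is why they cope better with long-term dependencies.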
Translate Words into a Different Language
In this chapter, you'll learn about the encoder-decoder architecture and how it can be used to model sequence-to-sequence datasets, converting information from one domain to another. You'll use this knowledge to build a neural machine translation model, training it to translate English sentences into French.

- Introduction to sequence-to-sequence models (50 xp)
- Create the eng-fra dataset (100 xp)
- Getting the vocabularies (100 xp)
- Mapping characters to integers and vice versa (100 xp)
- Neural machine translation (50 xp)
- Define the input and target vectors (100 xp)
- Initialize the input and target vectors (100 xp)
- Building the encoder and the decoder (100 xp)
- Train the encoder-decoder network (100 xp)
- Inference model for encoder and decoder (50 xp)
- Build inference models for encoder and decoder (100 xp)
- Predict the first character (100 xp)
- Predict the second character (100 xp)
- Generate a fully translated sentence (100 xp)
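A common way to lay out encoder-decoder training data (sketched here with a made-up two-pair corpus, not the course's eng-fra file) is to give the decoder an input that begins with a start token and a target that is the same sequence shifted one step, ending with a stop token:

```python
# Hypothetical English-French pairs standing in for the eng-fra dataset.
pairs = [("go", "va"), ("hi", "salut")]

encoder_inputs, decoder_inputs, decoder_targets = [], [], []
for eng, fra in pairs:
    encoder_inputs.append(eng)
    decoder_inputs.append("\t" + fra)   # '\t' acts as the start-of-sequence token
    decoder_targets.append(fra + "\n")  # target is the sequence shifted one step,
                                        # with '\n' as the stop token

# At timestep t, decoder_targets[i][t] is the character the decoder
# should predict after reading decoder_inputs[i][t].
print(decoder_inputs[0], decoder_targets[0])
```

This one-step offset is what lets the decoder be trained with teacher forcing: during training it reads the correct previous character and learns to emit the next one.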
Autocomplete Your Sentences
In this chapter, you'll build your very own machine learning seq2seq model. You'll use real-world messages from the Enron email dataset to train an encoder-decoder model, then use it to predict the correct ending for an incomplete input sentence.

- Convert email data to seq2seq (50 xp)
- Divide the sentences into prefixes and suffixes (100 xp)
- Create the vocabulary and the mappings (100 xp)
- Define the input and target vectors (100 xp)
- Initialize the input and target vectors (100 xp)
- Sentence autocompletion using encoder-decoder (50 xp)
- Building the encoder (100 xp)
- Building the decoder (100 xp)
- Train the encoder and decoder (100 xp)
- Autocomplete sentences using inference models (50 xp)
- Building the inference models (100 xp)
- Predict the first character using inference models (100 xp)
- Predict the second character (100 xp)
- Autocomplete sentences (100 xp)
- Congratulations (50 xp)
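The prefix/suffix split that turns plain sentences into autocomplete training data can be sketched in a few lines (using a made-up sentence rather than Enron data): the prefix becomes the encoder input and the suffix becomes the completion the decoder must produce.

```python
# Hypothetical helper: split a sentence into (prefix, suffix) at a
# character position. The prefix feeds the encoder; the suffix is the
# decoder's completion target.
def split_prefix_suffix(sentence, cut):
    return sentence[:cut], sentence[cut:]

sentence = "please call me when you get this"
prefix, suffix = split_prefix_suffix(sentence, 15)
print(repr(prefix), repr(suffix))
```

In practice you would generate many such pairs per sentence (one per cut point, or at word boundaries), so the model learns to complete prefixes of varying lengths.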
Biswanath is a Data Scientist with around nine years of experience at companies such as Oracle, Microsoft, and Adobe. He specializes in applying machine learning and deep learning techniques to complex business applications in computer vision and natural language processing. He is also a freelance educator, teaching statistics, mathematics, and machine learning. He holds a Master's degree in Computer Science from the Indian Institute of Science, Bangalore.