Natural Language Generation in Python

Imitate Shakespeare, translate language, and autocomplete sentences using deep learning in Python.

4 Hours · 13 Videos · 52 Exercises · 3,529 Learners · 4550 XP · Deep Learning for NLP Track



Course Description

Have you ever wondered how Gmail autocompletes your sentences, or what powers the WhatsApp suggestions when you're typing a message? The technology behind these helpful writing hints is machine learning. In this course, you'll build and train machine learning models for different natural language generation tasks. For example, you'll train a model on the literary works of Shakespeare and generate text in the style of his writing. You'll also learn how to create a neural translation model to translate English sentences into French. Finally, you'll train a seq2seq model to generate your own natural language autocomplete sentences, just like Gmail!

  1. Introduction to sequential data


    The order of words in sentences is important (unless Yoda you are called). That's why in this chapter, you'll learn how to represent your data sequentially and use a neural network architecture to model your text data. You'll learn how to create and train a recurrent network that generates new text, character by character. You'll also use a dataset of names to build your own baby name generator, using a very simple recurrent neural network and the Keras package.

    Handling sequential data
    50 xp
    Preprocess names dataset
    100 xp
    Preprocessing names dataset (cont'd)
    100 xp
    Introduction to recurrent neural networks
    50 xp
    Create input and target tensors
    100 xp
    Initialize input and target vectors with values
    100 xp
    Build and compile RNN network
    100 xp
    Inference using a recurrent neural network
    50 xp
    Train RNN model and start predictions
    100 xp
    Generate baby names
    100 xp
  2. Write like Shakespeare

    In this chapter, you'll find out how to overcome the limitations of recurrent neural networks when input sequences span long intervals. To avoid the vanishing and exploding gradient problems, you'll be introduced to long short-term memory (LSTM) networks, which are more effective when working with long-term dependencies. You'll work on a fun project where you'll build and train a simple LSTM model on selected literary works of Shakespeare to generate new text in his unique writing style.

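
    The reason LSTMs handle long-term dependencies better than a vanilla RNN is the gated, additive cell-state update. A minimal NumPy sketch of a single LSTM step (a hypothetical illustration with random weights, not the course's Keras implementation) makes the gating visible:

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def lstm_step(x, h_prev, c_prev, W, b):
        """One LSTM step: gates decide what to forget, what to write, what to expose."""
        z = W @ np.concatenate([x, h_prev]) + b  # all four gate pre-activations at once
        H = h_prev.size
        f = sigmoid(z[0:H])        # forget gate: how much of the old cell state to keep
        i = sigmoid(z[H:2*H])      # input gate: how much of the candidate to write
        o = sigmoid(z[2*H:3*H])    # output gate: how much of the cell state to expose
        g = np.tanh(z[3*H:4*H])    # candidate cell update
        c = f * c_prev + i * g     # additive update: gradients flow through c largely unsquashed
        h = o * np.tanh(c)         # hidden state passed to the next timestep
        return h, c

    H, X = 8, 5
    rng = np.random.default_rng(1)
    W = rng.normal(0, 0.1, (4 * H, X + H))  # random (untrained) weights for illustration
    b = np.zeros(4 * H)

    h = np.zeros(H)
    c = np.zeros(H)
    for t in range(3):  # run a few timesteps over random inputs
        h, c = lstm_step(rng.normal(size=X), h, c, W, b)
    print(h.shape, c.shape)
    ```

    Because the cell state is updated additively (`c = f * c_prev + i * g`) rather than repeatedly squashed through a `tanh` as in a vanilla RNN, gradients can persist across many timesteps, which is what lets the model learn the long-range structure in Shakespeare's text.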

In the following tracks

Deep Learning for NLP


Collaborators: Adel Nehme, Lis Sulmont

Biswanath Halder

Data Scientist

Biswanath is a Data Scientist with around nine years of experience at companies including Oracle, Microsoft, and Adobe. He specializes in applying machine learning and deep learning techniques to complex business applications in computer vision and natural language processing. He is also a freelance educator, teaching statistics, mathematics, and machine learning. He holds a Master's degree in Computer Science from the Indian Institute of Science, Bangalore.

What do other learners have to say?

I've used other sites—Coursera, Udacity, things like that—but DataCamp's been the one that I've stuck with.

Devon Edwards Joseph
Lloyds Banking Group

DataCamp is the top resource I recommend for learning data science.

Louis Maiden
Harvard Business School

DataCamp is by far my favorite website to learn from.

Ronald Bowers
Decision Science Analytics, USAA