Introduction to Large Language Models with GPT & LangChain

    ChatGPT is wildly popular, with over a billion visits per month. Although this web interface is great for many non-technical use cases, for programming and automation tasks, it is better to access GPT (the AI that powers ChatGPT) via the OpenAI API.

    Alongside GPT, you'll make use of LangChain, a programming framework for working with generative AI.

    You'll cover:

    • Getting set up with an OpenAI developer account and integration with Workspace.
    • Calling the chat functionality in the OpenAI API, with and without LangChain.
    • Simple prompt engineering.
    • Holding a conversation with GPT.
    • Ideas for incorporating GPT into a data analysis or data science workflow.

    You'll be using GPT to explore a dataset about electric cars in Washington state, USA.

    Before you begin

    You'll need a developer account with OpenAI.

    See getting-started.ipynb for steps on how to create an API key and store it in Workspace. In particular, you'll need to follow the instructions in the "Getting started with OpenAI" and "Setting up Workspace Integrations" sections.
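
    Once the key is stored, it's worth confirming that Python can actually see it before going further. Here is a minimal, hypothetical helper for that check (the function name is illustrative, not part of the exercise):

    ```python
    import os

    def check_api_key(var_name="OPENAI_API_KEY"):
        """Return the API key from the environment, or raise a clear error."""
        key = os.getenv(var_name)
        if not key:
            raise RuntimeError(
                f"{var_name} is not set; see getting-started.ipynb for setup steps."
            )
        return key
    ```

    Raising early with a named variable makes a missing-key problem much easier to diagnose than an opaque authentication error from the API later on.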

    Task 0: Setup

    We need to install the langchain package. LangChain is under rapid development, sometimes with breaking changes, so we pin the version.

    The langchain package also depends on a recent version of typing_extensions, so we need to update that package, again pinning the version.

    Instructions

    Run the following code to install langchain and typing_extensions.

    # Install the langchain package
    !pip install langchain==0.0.300
    # Update the typing_extensions package
    !pip install typing_extensions==4.8.0

    In order to chat with GPT, we first need to load the os and openai packages and set the API key from the environment variable you just created.

    Instructions

    • Import the os package.
    • Import the openai package.
    • Set openai.api_key to the OPENAI_API_KEY environment variable.
    # Import the os package
    import os
    # Import the openai package
    import openai
    
    # Set openai.api_key to the OPENAI_API_KEY environment variable
    openai.api_key = os.getenv("OPENAI_API_KEY")
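
    With the key set, a direct chat request is a list of role-tagged message dictionaries sent to the API. A minimal sketch against the openai 0.x interface that matches the pinned versions above (the model name is an assumption, and the request itself is commented out because it needs a live key):

    ```python
    # Each message is a dictionary with a role ("system", "user", or
    # "assistant") and the text content.
    messages = [
        {"role": "system", "content": "You are a helpful data analysis assistant."},
        {"role": "user", "content": "In one sentence, what is an electric vehicle?"},
    ]

    # Uncomment to send the request (requires a valid OPENAI_API_KEY):
    # response = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
    # print(response["choices"][0]["message"]["content"])
    ```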

    We need to import the langchain package. It has many submodules, so to save typing later, we'll also import some specific functions from those submodules.

    Instructions

    • Import the langchain package as lc.
    • From the langchain.chat_models module, import ChatOpenAI.
    • From the langchain.schema module, import AIMessage, HumanMessage, SystemMessage.
    # Import the langchain package as lc
    import langchain as lc
    
    # From the langchain.chat_models module, import ChatOpenAI
    from langchain.chat_models import ChatOpenAI
    
    # From the langchain.schema module, import AIMessage, HumanMessage, SystemMessage
    from langchain.schema import AIMessage, HumanMessage, SystemMessage
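
    To see how these imports fit together, here is a hedged sketch of assembling a message list for a chat call. The build_messages helper and the model name are illustrative, not part of the exercise, and the stand-in classes exist only so the sketch runs even without langchain installed:

    ```python
    try:
        from langchain.schema import HumanMessage, SystemMessage
    except ImportError:
        # Lightweight stand-ins so the sketch runs without langchain installed
        from collections import namedtuple
        HumanMessage = namedtuple("HumanMessage", "content")
        SystemMessage = namedtuple("SystemMessage", "content")

    def build_messages(question):
        """Pair a fixed system instruction with a user question."""
        return [
            SystemMessage(content="You are a helpful data analysis assistant."),
            HumanMessage(content=question),
        ]

    # With the imports above, a call would look like (requires OPENAI_API_KEY):
    # chat = ChatOpenAI(model="gpt-3.5-turbo")
    # response = chat(build_messages("Summarise the electric car dataset."))
    ```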