Retrieval Augmented Generation with LlamaIndex
Large language models (LLMs) like Llama 2 are the must-have technology of the year. Unfortunately, LLMs can't accurately answer questions about your business, because your private domain knowledge wasn't in their training data. The solution is to combine the LLM with a vector database like Chroma—a technique known as retrieval augmented generation (RAG): relevant documents are retrieved from the database and supplied to the LLM as context for its answer. Beyond this, incorporating AI into products is best done with an AI application framework like LlamaIndex. In this session you'll learn how to get started with Chroma and perform Q&A on your own documents using Llama 2, the RAG technique, and LlamaIndex.
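To make the RAG idea concrete before the session, here is a toy, dependency-free sketch of the core retrieve-then-prompt loop. It is not LlamaIndex or Chroma code: the bag-of-words "embedding", the in-memory index, and all names are illustrative stand-ins for a real embedding model and vector database, and the final prompt would be sent to an LLM such as Llama 2.

```python
# Toy illustration of RAG: embed documents, retrieve the most similar one
# for a query, and build an augmented prompt for the LLM.
# All names and data here are illustrative, not a real library API.
import math
from collections import Counter


def embed(text: str) -> Counter:
    """Bag-of-words 'embedding' standing in for a real embedding model."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


# A stand-in for your business documents, indexed in memory
# (a real app would store embeddings in a vector database like Chroma).
documents = [
    "Our refund policy allows returns within 30 days of purchase.",
    "The widget ships in blue, red, and green color options.",
]
index = [(doc, embed(doc)) for doc in documents]


def retrieve(query: str) -> str:
    """Return the document most similar to the query."""
    qv = embed(query)
    return max(index, key=lambda pair: cosine(qv, pair[1]))[0]


def build_prompt(query: str) -> str:
    """Augment the user's question with retrieved context for the LLM."""
    context = retrieve(query)
    return f"Context: {context}\n\nQuestion: {query}\nAnswer:"


prompt = build_prompt("What colors does the widget come in?")
print(prompt)
```

Frameworks like LlamaIndex wrap exactly this loop—loading documents, embedding them into a store such as Chroma, retrieving matches, and prompting the LLM—behind a few high-level calls, which is what the session walks through.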