Let’s take a second to do some 101.

Undoubtedly, you have interacted with many new “AI” features in your software lately. Perhaps Notion AI, GitHub Copilot, or even just ChatGPT. These are all fundamentally powered by a Large Language Model (LLM), such as GPT.

However, they’re not powered by the LLM alone. GPT is impressive, but not that impressive. Usually, it’s combined with a few other AI-powered tools.

The most common example is a vector database, which helps you inject relevant context into your LLM prompt based on the query.
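To make that concrete, here is a toy sketch of the retrieval step. Everything here is illustrative: the `embed` function is a stand-in bag-of-words counter, whereas a real vector database stores dense vectors produced by a learned embedding model and searches them at scale.

```python
import math
from collections import Counter


def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words token counts. Real systems use a
    # learned embedding model that maps text to dense float vectors.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(
        sum(v * v for v in b.values())
    )
    return dot / norm if norm else 0.0


class VectorStore:
    """Minimal in-memory stand-in for a vector database."""

    def __init__(self):
        self.docs = []  # list of (vector, original text)

    def add(self, text: str):
        self.docs.append((embed(text), text))

    def search(self, query: str, k: int = 1):
        qv = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(qv, d[0]), reverse=True)
        return [text for _, text in ranked[:k]]


store = VectorStore()
store.add("Refunds are processed within 5 business days.")
store.add("Our office is open Monday to Friday.")

# Retrieve the most relevant document and inject it into the LLM prompt.
context = store.search("how long do refunds take?")[0]
prompt = f"Answer using this context:\n{context}\n\nQuestion: how long do refunds take?"
```

The pattern is the key point: embed the query, find the nearest stored documents, and splice them into the prompt so the LLM answers from your data rather than from memory.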

To connect these pieces, we create what are called “chains”. You might have heard of Langchain, a very popular open source project that brought this concept into the mainstream.
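At its core, a chain is just an ordered pipeline where each step’s output feeds the next step’s input. The sketch below shows the idea with hypothetical step names; `llm_call` is a stub standing in for a real model API request, not any framework’s actual API.

```python
def run_chain(steps, value):
    # A "chain" threads a value through a sequence of steps.
    for step in steps:
        value = step(value)
    return value


# Hypothetical steps for a Q&A chain.
def retrieve_context(question: str) -> dict:
    # In practice this step would query a vector database.
    return {"question": question, "context": "Refunds take 5 business days."}


def build_prompt(data: dict) -> str:
    return f"Context: {data['context']}\nQuestion: {data['question']}"


def llm_call(prompt: str) -> str:
    # Stub: a real chain would send the prompt to an LLM here.
    return f"[model answer based on: {prompt!r}]"


answer = run_chain(
    [retrieve_context, build_prompt, llm_call],
    "How long do refunds take?",
)
```

Frameworks like Langchain (and hosted platforms like Relevance AI) add tooling around this pattern: prebuilt steps, retries, observability, and deployment.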

Full disclosure: I’m part of the founding team at Relevance AI. We provide an enterprise-ready, hosted chaining solution via SDK and Notebook. We’re kinda like Vercel for AI chains & agents. In this tutorial, we’ll be using a Relevance AI chain to power the Q&A feature.

In Relevance AI, chains are now often referred to as “tools”.