🤝 This template showcases a LangChain.js retrieval agent and the Vercel AI SDK in a Next.js project.
🛠️ The agent has access to a vector store retriever as a tool, as well as conversation memory, so it's particularly well suited to meta-questions about the current conversation (see the sketch below).
💻 You can find the prompt and model logic for this use case in app/api/chat/retrieval_agents/route.ts.
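To make the two points above concrete, here is a rough, simplified sketch of what such a route handler can look like. It is not the template's actual route.ts — the real file streams tokens back through the AI SDK — and the tool name, system prompt, model choice, and the `./retriever` import are illustrative assumptions rather than the template's real values:

```ts
// app/api/chat/retrieval_agents/route.ts — simplified, non-streaming sketch
import { NextRequest } from "next/server";
import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate, MessagesPlaceholder } from "@langchain/core/prompts";
import { AIMessage, HumanMessage } from "@langchain/core/messages";
import { createRetrieverTool } from "langchain/tools/retriever";
import { AgentExecutor, createToolCallingAgent } from "langchain/agents";
import { retriever } from "./retriever"; // hypothetical helper that wraps your vector store

export async function POST(req: NextRequest) {
  const { messages } = await req.json();

  // Earlier turns become the agent's memory; the final message is the new input.
  const chatHistory = messages
    .slice(0, -1)
    .map((m: { role: string; content: string }) =>
      m.role === "user" ? new HumanMessage(m.content) : new AIMessage(m.content),
    );
  const input: string = messages[messages.length - 1].content;

  // Expose the vector store retriever to the agent as a callable tool.
  const searchTool = createRetrieverTool(retriever, {
    name: "search_uploaded_docs",
    description: "Searches the uploaded documents and returns relevant passages.",
  });

  // Change the system message to give the agent a different persona.
  const prompt = ChatPromptTemplate.fromMessages([
    ["system", "You are a robot. Use the search tool to answer questions about the uploaded documents."],
    new MessagesPlaceholder("chat_history"),
    ["human", "{input}"],
    new MessagesPlaceholder("agent_scratchpad"),
  ]);

  const agent = await createToolCallingAgent({
    llm: new ChatOpenAI({ model: "gpt-4o-mini", temperature: 0 }),
    tools: [searchTool],
    prompt,
  });
  const agentExecutor = new AgentExecutor({ agent, tools: [searchTool] });

  const result = await agentExecutor.invoke({ input, chat_history: chatHistory });

  // The real route streams tokens as they are generated; returning the final text
  // keeps this sketch short and still works with a text-protocol chat hook.
  return new Response(result.output);
}
```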
🤖 By default, the agent is pretending to be a robot, but you can change the prompt to whatever you want!
🎨 The main frontend logic is found in app/retrieval_agents/page.tsx.
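On the client, the AI SDK's useChat hook handles message state and the request to the route. A stripped-down sketch is below, assuming an AI SDK version where useChat is exported from ai/react and accepts a streamProtocol option (older releases called it streamMode, newer ones export the hook from @ai-sdk/react); the real page.tsx has considerably more UI around this:

```tsx
"use client";

import { useChat } from "ai/react";

export default function RetrievalAgentsPage() {
  // The hook keeps message and input state and POSTs to the agent route;
  // streamProtocol: "text" tells it to read the response body as plain text.
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: "/api/chat/retrieval_agents",
    streamProtocol: "text",
  });

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <p key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </p>
      ))}
      <input
        value={input}
        onChange={handleInputChange}
        placeholder="Ask something about the text you uploaded..."
      />
      <button type="submit">Send</button>
    </form>
  );
}
```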
🌱 Before running this example, you'll first need to set up a Supabase (or other) vector store. See the README for more details.
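If you follow the README's Supabase path, the vector store wiring in code generally looks something like the sketch below. The documents table and match_documents function come from the README's SQL setup, and the environment variable names here are assumptions — check both against the README:

```ts
import { createClient } from "@supabase/supabase-js";
import { OpenAIEmbeddings } from "@langchain/openai";
import { SupabaseVectorStore } from "@langchain/community/vectorstores/supabase";

// Client for the Supabase project that holds the `documents` table and
// `match_documents` function created by the README's SQL snippet.
const client = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_PRIVATE_KEY!,
);

// Embeddings must match whatever model was used when ingesting the documents.
const vectorStore = new SupabaseVectorStore(new OpenAIEmbeddings(), {
  client,
  tableName: "documents",
  queryName: "match_documents",
});

// This is the kind of retriever the route handler sketch above imports
// and hands to createRetrieverTool.
export const retriever = vectorStore.asRetriever();
```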
👇 Upload some text, then try asking e.g. "What are some ways of doing retrieval in LangChain?" below!