Integrations
Ollama
You can use Ollama as an LLM provider for local development with RAGChat. To use an Ollama model, first initialize RAGChat with it:
```typescript
import { RAGChat, ollama } from "@upstash/rag-chat";

export const ragChat = new RAGChat({
  model: ollama("llama3.1"),
});
```
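Once initialized, you can send messages through the instance as usual. A minimal sketch, assuming an Ollama server is running locally at its default address (`http://localhost:11434`) with the `llama3.1` model already pulled, and that Upstash credentials are set in the environment:

```typescript
import { RAGChat, ollama } from "@upstash/rag-chat";

const ragChat = new RAGChat({
  // Assumes `ollama serve` is running and `llama3.1` has been pulled
  // (e.g. via `ollama pull llama3.1`).
  model: ollama("llama3.1"),
});

async function main() {
  // Send a question; the local Ollama model generates the answer.
  const response = await ragChat.chat("What is Ollama?");
  console.log(response.output);
}

main();
```

Since the model runs entirely on your machine, no LLM API key is required; only the Upstash Vector/Redis environment variables used by RAGChat itself need to be configured.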