Installation and Setup
To get started, we need to set up our environment. You can install the necessary libraries from your terminal.
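The exact package names here are an assumption and may differ across LlamaIndex releases, but a typical install for this stack (LlamaIndex, LlamaParse, the Upstash Vector integration, and python-dotenv for loading the `.env` file) looks like this:

```bash
pip install llama-index llama-parse llama-index-vector-stores-upstash python-dotenv
```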
Then, create a Vector index in the Upstash Console, copy the `UPSTASH_VECTOR_REST_URL` and `UPSTASH_VECTOR_REST_TOKEN` values, and paste them into our `.env` file.
Environment Variables
Create a `.env` file in your project directory and add the following content:
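A sketch of the `.env` contents, using the variables referenced in this post; `OPENAI_API_KEY` is an assumption based on the OpenAI language model used in Part 2:

```
UPSTASH_VECTOR_REST_URL=<your-upstash-vector-rest-url>
UPSTASH_VECTOR_REST_TOKEN=<your-upstash-vector-rest-token>
LLAMA_CLOUD_API_KEY=<your-llama-cloud-api-key>
# Assumed: used by the OpenAI LLM in Part 2
OPENAI_API_KEY=<your-openai-api-key>
```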
To get a `LLAMA_CLOUD_API_KEY`, you can follow the instructions in the LlamaCloud documentation.
Part 1: Parsing a Document
We can now move on to parsing a document. In this example, we’ll parse a file named `global_warming.txt`.
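A minimal parsing sketch, assuming LlamaParse reads `LLAMA_CLOUD_API_KEY` from the environment and that `global_warming.txt` sits in the project directory; the `result_type` value is illustrative:

```python
from dotenv import load_dotenv
from llama_parse import LlamaParse

# Load LLAMA_CLOUD_API_KEY (and the other keys) from the .env file
load_dotenv()

# Parse the document; result_type can be "text" or "markdown"
parser = LlamaParse(result_type="text")
documents = parser.load_data("./global_warming.txt")

# Preview the parsed content
print(documents[0].text[:500])
```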
If you are using Jupyter Notebook, you need to allow nested event loops to parse the document. You can do this by adding the following code snippet to your file:
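A common way to do this is with the `nest_asyncio` package:

```python
import nest_asyncio

# Allow LlamaParse's async calls to run inside the notebook's already-running event loop
nest_asyncio.apply()
```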
Part 2: Querying the Parsed Document with an LLM
In this part, we’ll use the `UpstashVectorStore` to create an index and query the content. We’ll use OpenAI as the language model to interpret the data and respond to questions based on the document. You can use other LLMs supported by LlamaIndex as well.
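A sketch of the indexing and querying step, assuming the `llama-index-vector-stores-upstash` integration and the `documents` list produced in Part 1; the example question is illustrative:

```python
import os

from dotenv import load_dotenv
from llama_index.core import StorageContext, VectorStoreIndex
from llama_index.vector_stores.upstash import UpstashVectorStore

load_dotenv()

# Store the document embeddings in Upstash Vector
vector_store = UpstashVectorStore(
    url=os.environ["UPSTASH_VECTOR_REST_URL"],
    token=os.environ["UPSTASH_VECTOR_REST_TOKEN"],
)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Build the index from the documents parsed in Part 1
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

# Ask a question about the document; with OPENAI_API_KEY set, LlamaIndex
# uses OpenAI as its default LLM and embedding model
query_engine = index.as_query_engine()
response = query_engine.query("What are the main effects of global warming?")
print(response)
```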