# LangChain
This example is a basic "hello world" of using LangChain with Ollama.
## Running the Example
1. Ensure you have the `llama3.1` model installed:

   ```bash
   ollama pull llama3.1
   ```
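
   If you want to confirm the model is available locally before moving on, `ollama list` prints every model that has been pulled:

   ```bash
   ollama list
   ```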
2. Install the Python requirements:

   ```bash
   pip install -r requirements.txt
   ```
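
   The repository's `requirements.txt` carries the actual pins; as a rough guide only (an assumption, not the file's verbatim contents), a LangChain + Ollama script typically needs:

   ```
   # Hypothetical contents; check requirements.txt in this directory for the real list.
   langchain
   langchain-community
   ```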
3. Run the example:

   ```bash
   python main.py
   ```
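
The script itself is only a few lines. Below is a minimal sketch of a LangChain "hello world" against a local Ollama server, assuming the `langchain-community` `Ollama` wrapper; the actual `main.py` in this directory may differ in its prompt and structure:

```python
# Minimal sketch, not necessarily the exact contents of main.py.
from langchain_community.llms import Ollama

# Point LangChain at the locally running Ollama server and the model pulled in step 1.
llm = Ollama(model="llama3.1")

# Ask a question and print the model's answer.
question = input("Ask me a question: ")
print(llm.invoke(question))
```

The call goes through the Ollama HTTP API on `localhost:11434`, so the Ollama server (`ollama serve` or the desktop app) needs to be running before you start the script.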