# LangChain
This example is a basic "hello world" of using LangChain with Ollama.
## Running the Example
- Ensure you have the `llama3.2` model installed: `ollama pull llama3.2`
- Install the Python requirements: `pip install -r requirements.txt`
- Run the example: `python main.py`
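
The script follows the usual LangChain pattern for a local Ollama model: build an LLM wrapper for the pulled model and call `invoke` on a prompt (rather than the older `predict`). Below is a minimal sketch of that pattern, not the repo's exact `main.py`; the import path and the prompt text are assumptions that depend on your installed LangChain version.

```python
# A minimal sketch of a LangChain "hello world" against a local Ollama server.
# Assumes the langchain-community package is installed; the import path may
# differ on other LangChain versions. The prompt text is illustrative only.
from langchain_community.llms import Ollama

# Point LangChain at the locally pulled llama3.2 model.
llm = Ollama(model="llama3.2")

# invoke() is the current entry point (it replaces the older predict()).
response = llm.invoke("Why is the sky blue?")
print(response)
```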