# LangChain Web Summarization

This example summarizes the website [https://ollama.com/blog/run-llama2-uncensored-locally](https://ollama.com/blog/run-llama2-uncensored-locally)

## Running the Example

1. Ensure you have the `llama3.2` model installed:

   ```bash
   ollama pull llama3.2
   ```

2. Install the Python requirements:

   ```bash
   pip install -r requirements.txt
   ```

3. Run the example:

   ```bash
   python main.py
   ```