# LangChain Web Summarization
This example summarizes the website [https://ollama.com/blog/run-llama2-uncensored-locally](https://ollama.com/blog/run-llama2-uncensored-locally).
## Running the Example
1. Ensure you have the `llama3.1` model installed:
   ```bash
   ollama pull llama3.1
   ```
2. Install the Python requirements:
   ```bash
   pip install -r requirements.txt
   ```
3. Run the example:
   ```bash
   python main.py
   ```
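For reference, the summarization itself follows the usual LangChain pattern: load the page into documents, then run a summarization chain against the local Ollama model. The sketch below is illustrative rather than a copy of `main.py`; the exact import paths and the `"stuff"` chain type are assumptions that depend on your LangChain version.

```python
from langchain_community.document_loaders import WebBaseLoader
from langchain_community.llms import Ollama
from langchain.chains.summarize import load_summarize_chain

# Fetch and parse the blog post into LangChain Document objects.
loader = WebBaseLoader("https://ollama.com/blog/run-llama2-uncensored-locally")
docs = loader.load()

# Talk to the locally running Ollama server using the model pulled in step 1.
llm = Ollama(model="llama3.1")

# "stuff" concatenates the page text into a single summarization prompt.
chain = load_summarize_chain(llm, chain_type="stuff")

result = chain.invoke({"input_documents": docs})
print(result["output_text"])
```

If everything is set up, the script should print a short summary of the blog post to stdout.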