# LangChain Web Summarization
This example summarizes a web page, in this case [https://ollama.com/blog/run-llama2-uncensored-locally](https://ollama.com/blog/run-llama2-uncensored-locally).
## Running the Example
1. Ensure you have the `llama3.2` model installed:
```bash
ollama pull llama3.2
```
2. Install the Python requirements:
```bash
pip install -r requirements.txt
```
3. Run the example:
```bash
python main.py
```
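
For context, the flow inside a script like this can be sketched as: load the page text with LangChain, then ask the local `llama3.2` model for a summary. This is a minimal illustrative sketch, not the exact contents of `main.py` — the `build_prompt` helper is a hypothetical name, and the loader/LLM classes are assumed to come from the packages in `requirements.txt`:

```python
def build_prompt(page_text: str) -> str:
    """Wrap raw page text in a summarization instruction (illustrative helper)."""
    return "Summarize the following text in a few sentences:\n\n" + page_text


def main() -> None:
    # Imports are deferred so the prompt helper stays usable on its own;
    # both packages are assumed to be provided by requirements.txt.
    from langchain_community.document_loaders import WebBaseLoader
    from langchain_ollama import OllamaLLM

    # Requires a running Ollama server with llama3.2 pulled (step 1 above).
    docs = WebBaseLoader(
        "https://ollama.com/blog/run-llama2-uncensored-locally"
    ).load()
    llm = OllamaLLM(model="llama3.2")
    print(llm.invoke(build_prompt(docs[0].page_content)))


if __name__ == "__main__":
    main()
```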