From 5d3f7fff26dc033cfa5659104b069338a4ad8695 Mon Sep 17 00:00:00 2001
From: boessu
Date: Wed, 8 May 2024 01:36:34 +0200
Subject: [PATCH] Update langchainpy.md (#4236)

fixing pip code.
---
 docs/tutorials/langchainpy.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/tutorials/langchainpy.md b/docs/tutorials/langchainpy.md
index b0235679..9a1bca0d 100644
--- a/docs/tutorials/langchainpy.md
+++ b/docs/tutorials/langchainpy.md
@@ -12,7 +12,7 @@ So let's figure out how we can use **LangChain** with Ollama to ask our question
 
 Let's start by asking a simple question that we can get an answer to from the **Llama2** model using **Ollama**. First, we need to install the **LangChain** package:
 
-`pip install langchain`
+`pip install langchain_community`
 
 Then we can create a model and ask the question: