diff --git a/docs/tutorials/langchainpy.md b/docs/tutorials/langchainpy.md
index b0235679..9a1bca0d 100644
--- a/docs/tutorials/langchainpy.md
+++ b/docs/tutorials/langchainpy.md
@@ -12,7 +12,7 @@ So let's figure out how we can use **LangChain** with Ollama to ask our question
 
 Let's start by asking a simple question that we can get an answer to from the **Llama2** model using **Ollama**. First, we need to install the **LangChain** package:
 
-`pip install langchain`
+`pip install langchain_community`
 
 Then we can create a model and ask the question:
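
For context on why the install command changes: since the LangChain 0.1 package split, third-party integrations such as the `Ollama` LLM wrapper live in `langchain_community`. A minimal sketch of the usage the updated tutorial leads into, assuming a local Ollama server is running and the `llama2` model has been pulled:

```python
# Minimal sketch, not part of the diff: uses the langchain_community package
# that the updated install instruction provides.
from langchain_community.llms import Ollama

llm = Ollama(model="llama2")          # talks to the local Ollama server
print(llm.invoke("Why is the sky blue?"))
```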