Tweaks to README.md (#906)

* Mentioned Docker Hub in docs
* Consolidated brew installs to one line
This commit is contained in:
parent 2665f3c28e
commit e5d1ce4dde
1 changed file with 7 additions and 6 deletions
README.md (13 changed lines)
@@ -29,7 +29,8 @@ curl https://ollama.ai/install.sh | sh
 
 ### Docker
 
-See the official [Docker image](https://hub.docker.com/r/ollama/ollama).
+The official [Ollama Docker image `ollama/ollama`](https://hub.docker.com/r/ollama/ollama)
+is available on Docker Hub.
 
 ## Quickstart
 
@@ -178,8 +179,7 @@ ollama list
 Install `cmake` and `go`:
 
 ```
-brew install cmake
-brew install go
+brew install cmake go
 ```
 
 Then generate dependencies and build:
@@ -203,9 +203,8 @@ Finally, in a separate shell, run a model:
 
 ## REST API
 
-See the [API documentation](docs/api.md) for all endpoints.
-
-Ollama has an API for running and managing models. For example to generate text from a model:
+Ollama has a REST API for running and managing models.
+For example, to generate text from a model:
 
 ```
 curl -X POST http://localhost:11434/api/generate -d '{
@@ -214,6 +213,8 @@ curl -X POST http://localhost:11434/api/generate -d '{
 }'
 ```
 
+See the [API documentation](./docs/api.md) for all endpoints.
+
 ## Community Integrations
 
 - [LangChain](https://python.langchain.com/docs/integrations/llms/ollama) and [LangChain.js](https://js.langchain.com/docs/modules/model_io/models/llms/integrations/ollama) with [example](https://js.langchain.com/docs/use_cases/question_answering/local_retrieval_qa)
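The JSON fields of the README's curl example fall between the last two hunks and are not shown in this diff. A minimal sketch of the payload that request sends to `/api/generate`, assuming the `model` and `prompt` fields from Ollama's API documentation; the model name here is only an illustration:

```python
import json

# Sketch of the request body behind the README's curl example. The diff
# elides the JSON fields; "model" and "prompt" follow Ollama's /api/generate
# documentation, and the model name is illustrative only.
payload = {
    "model": "llama2",
    "prompt": "Why is the sky blue?",
}
body = json.dumps(payload)

# The equivalent curl invocation, assembled for clarity:
curl_cmd = f"curl -X POST http://localhost:11434/api/generate -d '{body}'"
print(curl_cmd)
```

Sending this against a locally running `ollama serve` would stream generated text back as JSON lines.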