From 02fe26c44b8c96744c7467c631c81af1bdf00921 Mon Sep 17 00:00:00 2001
From: Matt Williams
Date: Thu, 7 Dec 2023 13:46:30 -0800
Subject: [PATCH] update the readme as per bruce

Signed-off-by: Matt Williams
---
 examples/typescript-simplechat/readme.md | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/examples/typescript-simplechat/readme.md b/examples/typescript-simplechat/readme.md
index 6c0a1f4b..ccd4aaf6 100644
--- a/examples/typescript-simplechat/readme.md
+++ b/examples/typescript-simplechat/readme.md
@@ -2,6 +2,14 @@
 
 The **chat** endpoint is one of two ways to generate text from an LLM with Ollama. At a high level you provide the endpoint an array of message objects with a role and content specified. Then with each output and prompt, you add more messages, which builds up the history.
 
+## Run the Example
+
+There are a few ways to run this, just like any Typescript code:
+
+1. Compile with `tsc` and then run it with `node client.js`.
+2. Install `tsx` and run it with `tsx client.ts`.
+3. Install `bun` and run it with `bun client.ts`.
+
 ## Review the Code
 
 You can see in the **chat** function that is actually calling the endpoint is simply done with:
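
For context on the readme text this patch touches, below is a minimal sketch of what a call to Ollama's `/api/chat` endpoint can look like in TypeScript. It is not the repository's actual `client.ts`: the `chat` helper, the `Message` type, and the choice of the `llama2` model are assumptions for illustration, and it uses a non-streaming request for brevity where the real example streams the response.

```typescript
// Sketch only: assumes a local Ollama server on the default port (11434)
// and a pulled "llama2" model. Runnable with tsx, bun, or Node 18+ (built-in fetch).
type Message = { role: "system" | "user" | "assistant"; content: string };

async function chat(messages: Message[]): Promise<Message> {
  // POST the accumulated message history to the chat endpoint.
  const response = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    body: JSON.stringify({ model: "llama2", messages, stream: false }),
  });
  const json = await response.json();
  // The reply is returned as a single message object when stream is false.
  return json.message as Message;
}

async function main() {
  const history: Message[] = [{ role: "user", content: "Why is the sky blue?" }];
  const reply = await chat(history);
  history.push(reply); // append each reply so follow-up prompts carry the history
  console.log(reply.content);
}

main();
```

Appending each assistant reply (and each new user prompt) to the `messages` array is what "builds up the history" mentioned in the readme text above: the endpoint itself is stateless, so the full conversation is resent on every call.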