Cleanup as per Bruce

Signed-off-by: Matt Williams <m@technovangelist.com>
Matt Williams 2023-12-06 15:44:40 -08:00
parent aec742b6d2
commit 76bc4d0458
2 changed files with 4 additions and 4 deletions


@@ -15,6 +15,8 @@ def chat(messages):
     for line in r.iter_lines():
         body = json.loads(line)
+        if "error" in body:
+            raise Exception(body["error"])
         if body.get("done") is False:
             message = body.get("message", "")
             content = message.get("content", "")
@@ -22,8 +24,6 @@ def chat(messages):
             # the response streams one token at a time, print that as we receive it
             print(content, end="", flush=True)
-        if "error" in body:
-            raise Exception(body["error"])
         if body.get("done", False):
             message["content"] = output
@@ -32,7 +32,7 @@ def chat(messages):
def main():
    messages = []
) # the context stores a conversation history, you can use this to make the model more context aware
    while True:
        user_input = input("Enter a prompt: ")
        print()
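For context, the hunks above move the error check so it runs as soon as each streamed chunk is decoded, before any content handling. A minimal offline sketch of that parse loop, using a list of simulated NDJSON chunks in place of the live `r.iter_lines()` response (the `fake_lines` data is illustrative, not real server output):

```python
import json

def parse_stream(lines):
    """Accumulate streamed chat tokens the way the patched chat() loop does."""
    output = ""
    for line in lines:
        body = json.loads(line)
        # after this commit, errors are raised as soon as each chunk is decoded
        if "error" in body:
            raise Exception(body["error"])
        if body.get("done") is False:
            message = body.get("message", "")
            content = message.get("content", "")
            output += content
        if body.get("done", False):
            # the final chunk carries no content; return the accumulated message
            return {"role": "assistant", "content": output}

# simulated chunks, standing in for r.iter_lines() on a streaming response
fake_lines = [
    b'{"message": {"role": "assistant", "content": "Hello"}, "done": false}',
    b'{"message": {"role": "assistant", "content": " world"}, "done": false}',
    b'{"done": true}',
]
print(parse_stream(fake_lines))
```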


@@ -21,4 +21,4 @@ In the **main** function, we collect `user_input` and add it as a message to our
 ## Next Steps
-In this example, all generations are kept. You might want to experiment with summarizing everything older than 10 conversations to enable longer history with less context being used.
+In this example, all generations are kept. You might want to experiment with summarizing everything older than 10 conversations to enable longer history with less context being used.
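The summarization idea above could be sketched as a pruning step run before each request. Here `prune_history` and its `summarize` hook are hypothetical names, not part of the example's code; a real version would ask the model itself to condense the old messages:

```python
def prune_history(messages, keep=10, summarize=None):
    """Replace everything older than the last `keep` messages with one summary message."""
    if len(messages) <= keep:
        return messages
    old, recent = messages[:-keep], messages[-keep:]
    # `summarize` is a stand-in hook; in practice you would send `old` back to
    # the model with a "summarize this conversation" prompt
    summary_text = summarize(old) if summarize else f"(summary of {len(old)} earlier messages)"
    return [{"role": "system", "content": summary_text}] + recent

# 25 messages of history collapse to one summary plus the 10 most recent
history = [{"role": "user", "content": f"msg {i}"} for i in range(25)]
pruned = prune_history(history, keep=10)
print(len(pruned))
```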