llama_cpp server: fix to ChatCompletionRequestMessage

When I generate a client, generation breaks because the generator fails to process the schema of `ChatCompletionRequestMessage`

These changes fix that:
- `Union[Literal["user"], Literal["channel"], ...]` is equivalent to `Literal["user", "channel", ...]` (PEP 586 defines the multi-value form as shorthand for the Union), so the shorter spelling is safe; see the sketch after this list
- Turns out the default value `Literal["user"]` isn't JSON serializable, so replace it with the plain string `"user"`
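
A quick sketch of the equivalence in the first point (standalone Python, not code from this repo; the role names are just the examples from the bullet above):

```python
from typing import Literal, Union, get_args

# PEP 586 defines Literal["user", "channel"] as shorthand for
# Union[Literal["user"], Literal["channel"]]; type checkers treat the
# two spellings as the same type, the short one just reads better.
Long = Union[Literal["user"], Literal["channel"]]
Short = Literal["user", "channel"]

print(get_args(Long))   # (typing.Literal['user'], typing.Literal['channel'])
print(get_args(Short))  # ('user', 'channel')
```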
Lucas Doyle 2023-05-01 11:48:37 -07:00
parent fa2a61e065
commit dbbfc4ba2f
2 changed files with 3 additions and 3 deletions

@@ -58,7 +58,7 @@ class Completion(TypedDict):
 class ChatCompletionMessage(TypedDict):
-    role: Union[Literal["assistant"], Literal["user"], Literal["system"]]
+    role: Literal["assistant", "user", "system"]
     content: str
 
 class ChatCompletionChoice(TypedDict):
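
For illustration, the simplified annotation in the hunk above accepts exactly the same values as the old Union spelling (a standalone sketch, not the full module):

```python
from typing import Literal, TypedDict

class ChatCompletionMessage(TypedDict):
    role: Literal["assistant", "user", "system"]
    content: str

# Values that type-checked before still type-check; only the
# annotation's spelling changed.
msg: ChatCompletionMessage = {"role": "user", "content": "Hello!"}
```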

@@ -215,8 +215,8 @@ def create_embedding(
 class ChatCompletionRequestMessage(BaseModel):
-    role: Union[Literal["system"], Literal["user"], Literal["assistant"]] = Field(
-        default=Literal["user"], description="The role of the message."
+    role: Literal["system", "user", "assistant"] = Field(
+        default="user", description="The role of the message."
     )
     content: str = Field(default="", description="The content of the message.")
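
With the plain string default, pydantic (v1, the API in use at the time) can serialize the field into the generated JSON schema; a minimal standalone sketch of the fixed model:

```python
from typing import Literal

from pydantic import BaseModel, Field

class ChatCompletionRequestMessage(BaseModel):
    role: Literal["system", "user", "assistant"] = Field(
        default="user", description="The role of the message."
    )
    content: str = Field(default="", description="The content of the message.")

# Schema generation now succeeds: role comes out as a string enum
# with default "user" instead of an unserializable Literal object.
print(ChatCompletionRequestMessage.schema_json(indent=2))
```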