Bug fixed with n_ctx=0 (#1015)
If 'n_ctx' is set to 0, the code should use the maximum context length of the selected model, but this did not work. There was a problem with the initialization of this parameter and a related problem with 'n_batch'.
parent 5a8944672f
commit f1c631dc53
1 changed file with 6 additions and 0 deletions
@@ -923,6 +923,12 @@ class Llama:
         self._model = _LlamaModel(
             path_model=self.model_path, params=self.model_params, verbose=self.verbose
         )
+        # Set the default value for the context and correct the batch
+        if n_ctx == 0:
+            n_ctx = self._model.n_ctx_train()
+            self.n_batch = min(n_ctx, n_batch)
+            self.context_params.n_ctx = self._model.n_ctx_train()
+            self.context_params.n_batch = self.n_batch
 
         self._ctx = _LlamaContext(
             model=self._model,
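For reference, a minimal usage sketch of the behavior this change enables, assuming llama-cpp-python's public Llama constructor and its n_ctx() accessor (the model path below is hypothetical):

from llama_cpp import Llama

# With the fix, n_ctx=0 falls back to the model's trained context
# length (n_ctx_train), and n_batch is clamped so it never exceeds n_ctx.
llm = Llama(model_path="./models/model.gguf", n_ctx=0, n_batch=512)

# Reports the model's maximum (training) context length rather than 0.
print(llm.n_ctx())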