
Commit

Update max_tokens in constants.ts
max_tokens for Llama 3.1 models must be less than or equal to 8000, but it was set to 8192. Changing it to 8000 fixes the error.
fernsdavid25 authored Oct 21, 2024
1 parent cd4ddfd commit f761509
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion app/lib/.server/llm/constants.ts
@@ -1,5 +1,5 @@
 // see https://docs.anthropic.com/en/docs/about-claude/models
-export const MAX_TOKENS = 8192;
+export const MAX_TOKENS = 8000;
 
 // limits the number of model responses that can be returned in a single request
 export const MAX_RESPONSE_SEGMENTS = 2;
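A single global MAX_TOKENS has to satisfy the strictest provider (here, Llama 3.1's 8000-token cap). An alternative, sketched below, is a per-model cap that clamps the requested value. This is not from the repository: `MODEL_MAX_TOKENS` and `clampMaxTokens` are hypothetical names, and the model identifiers are illustrative.

```typescript
// Hypothetical sketch: clamp max_tokens per model instead of using one
// global constant. Names and model IDs below are assumptions, not repo code.
const DEFAULT_MAX_TOKENS = 8000;

const MODEL_MAX_TOKENS: Record<string, number> = {
  // Llama 3.1 endpoints reject max_tokens greater than 8000
  "llama-3.1-70b": 8000,
  // Claude models accept up to 8192 per the Anthropic models doc linked above
  "claude-3-5-sonnet": 8192,
};

function clampMaxTokens(model: string, requested: number): number {
  // Fall back to the conservative default for unknown models
  const limit = MODEL_MAX_TOKENS[model] ?? DEFAULT_MAX_TOKENS;
  return Math.min(requested, limit);
}

console.log(clampMaxTokens("llama-3.1-70b", 8192)); // 8000
console.log(clampMaxTokens("claude-3-5-sonnet", 8192)); // 8192
```

With this shape, lowering the global constant would be unnecessary: each request is capped at whatever its target model actually allows.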

