
openai #10524: parameter_error

Unsupported parameter: the specified parameter is not supported with this model. Use the correct alternative parameter as instructed.

This error has been identified and solved.

Reason

The error occurs because the max_tokens parameter has been deprecated in favor of max_completion_tokens for certain OpenAI models, particularly the newer o1 series.

  • The max_tokens parameter is no longer supported for the "o1" series models (such as o1-preview and o1-mini), as it does not account for the internal reasoning tokens generated by these models.

  • The new max_completion_tokens parameter is designed to control the total number of tokens generated, including both visible completion tokens and internal reasoning tokens. This is necessary because the total tokens generated by the o1 series models can exceed the number of visible tokens due to these reasoning tokens.

In contrast, older models like GPT-4 still use the max_tokens parameter, which is why you might see inconsistencies depending on the model you are using.
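As a rough illustration of why the new parameter exists (the token counts below are hypothetical, not real API output), the budget set by max_completion_tokens is consumed by both reasoning and visible tokens:

```python
# Hypothetical token counts for a single o1-series response.
# max_completion_tokens caps the *sum* of reasoning and visible tokens,
# so the visible output can be much shorter than the limit suggests.
max_completion_tokens = 1000
reasoning_tokens = 650  # internal reasoning; generated but not returned
visible_tokens = 350    # completion text actually returned

total_generated = reasoning_tokens + visible_tokens
assert total_generated <= max_completion_tokens

# With older models there are no hidden reasoning tokens,
# so max_tokens only had to cover the visible output.
```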

Solution

To fix the error, you need to use the correct parameter based on the OpenAI model you are utilizing.

For o1 series models (e.g., o1-preview, o1-mini):

  • Use max_completion_tokens instead of max_tokens.

For GPT-4 models:

  • Continue using max_tokens; max_completion_tokens is not compatible with these models.

Update your API requests to match the requirements of the model you are calling.
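The rules above can be sketched as a small model-aware request builder. This is a minimal illustration, not official SDK code: the helper names and the model-prefix check are assumptions, and the resulting kwargs are intended for the v1 openai Python SDK's client.chat.completions.create(**kwargs).

```python
def token_limit_param(model: str) -> str:
    # o1-series models reject max_tokens and expect max_completion_tokens;
    # older chat models (e.g. gpt-4) still take max_tokens.
    # The prefix check is an illustrative heuristic, not an official rule.
    return "max_completion_tokens" if model.startswith("o1") else "max_tokens"

def build_request_kwargs(model: str, messages: list, limit: int) -> dict:
    # Keyword arguments suitable for client.chat.completions.create(**kwargs)
    return {"model": model, "messages": messages, token_limit_param(model): limit}

msgs = [{"role": "user", "content": "Hello"}]
print(build_request_kwargs("o1-mini", msgs, 1000))   # uses max_completion_tokens
print(build_request_kwargs("gpt-4", msgs, 200))      # uses max_tokens
```

Centralizing the parameter choice in one helper means a model swap only requires changing the model string, not every call site.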

Original Error Message

Raw

Unsupported parameter: the specified parameter is not supported with this model. Use the correct alternative parameter as instructed.


© 2024 Portkey, Inc. All rights reserved
